Learn Mistral

You're reading from Learn Mistral: Elevating Mistral systems through embeddings, agents, RAG, AWS Bedrock, and Vertex AI

Product type: Paperback
Published: Oct 2025
Publisher: Packt
ISBN-13: 9781835888643
Length: 528 pages
Edition: 1st
Author: Pavlo Cherkashin

Table of Contents (14 chapters)

Preface
1. Strengths, Limitations, and Use Cases of Language Models
2. Setting Up Your Own Chat
3. Managing the Model
4. Mastering Embeddings
5. Agents: From Automation to Intelligence
6. Unpacking RAG Workflows
7. Coding with Mistral
8. Building Smarter Defenses with Mistral
9. Take-Home RAG Challenges
10. Mistral on AWS Bedrock
11. Harnessing Mistral’s Power via Google Cloud Vertex AI
12. Other Books You May Enjoy
13. Index

Extracurricular

Having built a solid foundation in working with embeddings, you’re now ready to explore more advanced and creative applications. Let these exercises spark new ideas for your own projects.

  1. Experiment with different vector databases: While Pinecone is a powerful vector database, there are several others to explore, such as FAISS, Annoy, Milvus, and Neo4j’s embeddings framework. Each database has its own strengths and use cases. Try them out to understand their performance characteristics and how they handle large-scale vector searches (a minimal FAISS sketch follows this list).
  2. Explore different distance metrics: Cosine similarity is commonly used for comparing embeddings, but other distance metrics such as Euclidean distance and dot-product similarity can also be useful depending on the application. Experiment with different metrics to see how they affect the results and which metric is best suited for specific types of data (see the metric comparison after this list).
  3. Understand...
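
As a starting point for the first exercise, here is a minimal FAISS sketch for indexing a batch of vectors and running a nearest-neighbour search. The vectors are random placeholders standing in for real embeddings, and the dimensionality and top-k value are illustrative assumptions to adjust for your own model and corpus.

```python
import numpy as np
import faiss

dim = 1024                        # embedding dimensionality; adjust to your model
rng = np.random.default_rng(0)

# Stand-in corpus of 1,000 "document" vectors; FAISS expects float32.
corpus = rng.random((1000, dim), dtype=np.float32)
faiss.normalize_L2(corpus)        # normalise so inner product == cosine similarity

index = faiss.IndexFlatIP(dim)    # exact (brute-force) inner-product index
index.add(corpus)

# A single stand-in query vector, normalised the same way.
query = rng.random((1, dim), dtype=np.float32)
faiss.normalize_L2(query)

scores, ids = index.search(query, 5)   # top-5 nearest neighbours
print(ids[0], scores[0])
```

An exact flat index like this is a useful baseline; once it works, you can swap in an approximate index (or a hosted service such as Pinecone or Milvus) and compare recall and latency on the same queries.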
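
For the second exercise, the small comparison below computes all three metrics on the same pair of vectors with NumPy; the helper names and random inputs are purely illustrative.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def euclidean_distance(a, b):
    return float(np.linalg.norm(a - b))

def dot_product(a, b):
    return float(np.dot(a, b))

# Two placeholder "embeddings"; swap in real model output to compare.
rng = np.random.default_rng(1)
a, b = rng.random(1024), rng.random(1024)

print(f"cosine similarity : {cosine_similarity(a, b):.4f}")
print(f"euclidean distance: {euclidean_distance(a, b):.4f}")
print(f"dot product       : {dot_product(a, b):.4f}")
```

For L2-normalised embeddings, cosine similarity and dot product produce the same ranking, and squared Euclidean distance equals 2 − 2 × cosine similarity, so the metrics only diverge when vector magnitudes carry meaning.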