Introducing CodeLlama 70B: A 70 billion-parameter model achieving SOTA performance in code generation.
Welcome to the latest edition of the AI in 5 newsletter with Clarifai!
Every week, we bring you the models, tools and tips to build production-ready AI!
This week, we bring you: 👇
Cross-modal search 🔎
Cross-modal search refers to the ability to search across different modalities: users can find images, videos, and audio using natural-language queries, and vice versa.
The following blog guides you through cross-modal search and helps you learn how to perform text-to-image search with Clarifai using both the API and the portal UI. 👇
Read the blog here.
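As a hedged sketch of what the blog walks through, a text-to-image search can be run with the Clarifai Python SDK's `Search` client. The `user_id`, `app_id`, and prompt below are placeholders, and the exact response fields are assumptions based on the SDK's documented shape; a personal access token (PAT) must be configured in the environment.

```python
def build_text_query(text: str) -> list:
    """Build a rank list that scores stored images by similarity to a text prompt."""
    return [{"text_raw": text}]


def text_to_image_search(prompt: str, user_id: str, app_id: str) -> None:
    """Run a text-to-image search against a Clarifai app (needs a CLARIFAI_PAT env var)."""
    from clarifai.client.search import Search  # pip install clarifai

    # top_k limits how many hits each results page returns.
    search = Search(user_id=user_id, app_id=app_id, top_k=5)
    for page in search.query(ranks=build_text_query(prompt)):
        for hit in page.hits:
            # Each hit carries the matched input and its similarity score.
            print(hit.input.data.image.url, hit.score)


# Usage (placeholder IDs; requires a valid PAT):
# text_to_image_search("a dog playing in the snow", "YOUR_USER_ID", "YOUR_APP_ID")
```

The same query can also be built in the portal UI; the SDK path is convenient when search needs to run inside a pipeline.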
New models in the Clarifai platform 🔥
Databricks Connect UI module 🚀
The Clarifai-PySpark integration lets Databricks and Clarifai interact for tasks such as uploading client datasets, annotating data, and exporting and storing annotations.
You can use this Databricks Connect module to:
1. Authenticate a Databricks connection and connect with its compute clusters.
2. Export data and annotations from a Clarifai app into Databricks volume and table.
3. Import data from Databricks volume into the Clarifai app and dataset.
4. Update annotation information in the chosen Delta table whenever annotations change in the Clarifai app.
Try out the module here.
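The steps above can be sketched with the `clarifai-pyspark` package roughly as follows. This is a minimal sketch, not the module's definitive API: the class and method names (`ClarifaiPySpark`, `upload_dataset_from_volume`, `export_annotations_to_dataframe`) and all IDs and paths are assumptions to be checked against the module's docs.

```python
def volume_path(catalog: str, schema: str, volume: str, subdir: str = "") -> str:
    """Build a Unity Catalog volume path of the form /Volumes/<catalog>/<schema>/<volume>."""
    base = f"/Volumes/{catalog}/{schema}/{volume}"
    return f"{base}/{subdir}" if subdir else base


def sync_databricks_and_clarifai(user_id: str, app_id: str, dataset_id: str, pat: str) -> None:
    """Hypothetical end-to-end flow: import from a volume, export annotations to Delta."""
    from clarifaipyspark.client import ClarifaiPySpark  # pip install clarifai-pyspark

    # 1. Authenticate a Databricks<->Clarifai connection.
    cspark = ClarifaiPySpark(user_id=user_id, app_id=app_id, pat=pat)
    dataset = cspark.dataset(dataset_id=dataset_id)

    # 3. Import data from a Databricks volume into the Clarifai dataset.
    dataset.upload_dataset_from_volume(
        folder_path=volume_path("main", "default", "images")
    )

    # 2. Export annotations from the Clarifai app into a Spark dataframe,
    #    then persist them to a Delta table (step 4 keeps this table in sync).
    annotations_df = dataset.export_annotations_to_dataframe()
    annotations_df.write.format("delta").mode("overwrite").saveAsTable("clarifai_annotations")
```

Run inside a Databricks notebook so the Spark session and Unity Catalog volumes are available.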
AI tip of the week: 📌
Passing inference parameters to the Whisper model.
When accessing the Whisper model from the Clarifai community, you can also pass inference parameters, such as "task", to choose whether to translate or transcribe an audio file.
The following code shows how you can translate and transcribe a Spanish audio file using Clarifai's Python SDK. Check out the code here.
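As a hedged sketch of the pattern, the SDK's `Model` client accepts an `inference_params` dict on prediction calls. Whisper's `task` parameter takes `"transcribe"` (keep the source language) or `"translate"` (produce English); the model URL, PAT, and audio URL below are placeholders.

```python
def whisper_params(task: str) -> dict:
    """Whisper accepts task="transcribe" (same language) or task="translate" (to English)."""
    if task not in ("transcribe", "translate"):
        raise ValueError(f"unsupported task: {task!r}")
    return {"task": task}


def run_whisper(audio_url: str, task: str, pat: str) -> None:
    """Call a Whisper model on Clarifai with inference parameters (placeholder model URL)."""
    from clarifai.client.model import Model  # pip install clarifai

    model = Model(url="https://clarifai.com/openai/whisper/models/whisper-large-v2", pat=pat)
    # task="translate" turns Spanish speech into English text;
    # task="transcribe" keeps the output in Spanish.
    prediction = model.predict_by_url(
        url=audio_url,
        input_type="audio",
        inference_params=whisper_params(task),
    )
    print(prediction.outputs[0].data.text.raw)
```

Calling `run_whisper(spanish_audio_url, "translate", pat)` and then again with `"transcribe"` shows both behaviors side by side.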
Want to learn more from Clarifai? “Subscribe” to make sure you don’t miss the latest news, tutorials, educational materials, and tips. Thanks for reading!