This AI newsletter is all you need #43

What happened this week in AI by Louie

This week, competition in the race for generative AI and LLMs continued to intensify. Amazon announced its foray into the space by hosting models from Amazon, Stability AI, and AI21 Labs, and by promoting the specifications of its in-house chips designed for inference and training. Elon Musk also entered the race, hiring top talent from DeepMind, including Igor Babuschkin and Manuel Kroiss, and purchasing 10,000 GPUs to build a "TruthGPT." Microsoft's GPT-4-powered Bing has increased the pressure on Google to release LLM-powered search options (set to arrive under the project codename "Magi"), especially as Samsung is reportedly considering replacing Google with Bing as the default search engine on its mobile devices.

In the open-source LLM movement, Databricks released Dolly 2, a 12-billion-parameter instruction-tuned language model based on EleutherAI’s Pythia-12b and fine-tuned on approximately 15,000 instruction-following examples. The release is significant because it is fully licensed for research and commercial use, unlike many previous open-source instruction-tuned models with questionable legal standing (some built on the leaked Meta LLaMA weights or by distilling outputs from OpenAI’s GPT-3.5, for example).

To cap off the week, the Auto-GPT autonomous agent project surpassed PyTorch in GitHub stars (now at 89k), highlighting the fast-paced and viral nature of AI products and innovations.

- Louie Peters — Towards AI Co-founder and CEO

Hottest News

1. With Bedrock, Amazon enters the generative AI race

Amazon is entering the generative AI competition. However, instead of building all of its models in-house, it is partnering with third-party startups to host their models on AWS. It introduced Amazon Bedrock, a service for building AI-powered applications on top of pre-trained foundation models from AWS and partner startups, with text and image generation (images, logos, and graphics) available through an API.
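For readers curious about what calling a Bedrock-hosted model could look like in practice, here is a minimal sketch using boto3's Bedrock runtime client. The client name, model ID, and request body are assumptions based on AWS's documented interface rather than details from the announcement, and availability depends on your account and region.

```python
# Minimal sketch of invoking a Bedrock-hosted text model with boto3.
# The "bedrock-runtime" client, the AI21 model ID, and the request body are
# illustrative assumptions; check the AWS docs for what your account exposes.
import json

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="ai21.j2-mid-v1",  # hypothetical example: an AI21 Jurassic-2 model
    body=json.dumps({"prompt": "Write a tagline for a hiking-gear brand.", "maxTokens": 50}),
    contentType="application/json",
    accept="application/json",
)

print(json.loads(response["body"].read()))
```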

2. AI: China tech giant Alibaba to roll out ChatGPT rival 

Alibaba, the Chinese technology giant, has announced its plan to launch an AI-powered chatbot similar to ChatGPT, called Tongyi Qianwen. This product will be integrated into Alibaba's various businesses through its cloud computing unit, although the exact timeline for its release has not been specified yet.

3. Spain’s privacy watchdog says it’s probing ChatGPT too

Spain's data protection authority, the AEPD, is conducting a preliminary investigation into OpenAI, the maker of ChatGPT, over potential violations of the General Data Protection Regulation (GDPR) of the European Union. This follows a similar move by Italy. However, there has been no order from the regulator to suspend processing by OpenAI.

4. Italy gives OpenAI initial to-do list for lifting ChatGPT suspension order

Italy's data protection regulator has given OpenAI a list of requirements to comply with GDPR and lift the ban on ChatGPT. These requirements include publishing an information notice, implementing age gating, clarifying its legal basis for processing data, providing user data rights, allowing objections to data processing, and running an awareness campaign for Italian users.

5. Elon Musk is moving forward with a new generative AI project at Twitter after purchasing thousands of GPUs 

Elon Musk is reportedly assembling a team of AI experts to launch an AI startup that would compete with OpenAI, the research organization he co-founded years ago, according to a report by the Financial Times.

Five 5-minute reads/videos to keep you learning

1. Top NLP papers of March

Cohere For AI's community has released a selection of NLP research for March 2023, featuring cutting-edge language models, unparalleled text generation, and revolutionary summarization techniques. This post covers an array of topics, showcasing the latest advancements in large language models and more.

2. What Are Transformer Models and How Do They Work? 

Transformers are a machine learning breakthrough that has attracted significant attention in recent years. This blog post offers a conceptual introduction to the transformer architecture, explaining how it works and walking through each of its components.
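As a companion to that post, here is a minimal NumPy sketch of scaled dot-product attention, the core operation inside every transformer block. The random tensors stand in for real query/key/value projections.

```python
# Minimal sketch of scaled dot-product attention, the core transformer operation.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # query-key similarity
    weights = softmax(scores, axis=-1)              # attention weights over positions
    return weights @ V                              # weighted sum of value vectors

# Toy example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```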

3. A new, unique AI dataset for animating amateur drawings

Meta AI developed a research demo of an AI system for animating artwork. They are releasing the animation code and a dataset of 180k annotated amateur drawings to aid other AI researchers. The demo is browser-based and allows users to upload an image, verify or correct a few annotations, and receive a short animation of their character.

4. A comprehensive guide to training and fine-tuning LLaMA

This tutorial covers training and fine-tuning LLaMA, a large language model. Specifically, it focuses on Lit-LLaMA, a rewritten version that can perform inference on an 8 GB consumer GPU. The tutorial also explores how Lightning Fabric is used to speed up the PyTorch code.
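For context, the training-loop pattern that Lightning Fabric enables looks roughly like the sketch below; the model and data here are toy stand-ins, not Lit-LLaMA itself, so see the tutorial for the actual fine-tuning code.

```python
# Rough sketch of the Lightning Fabric pattern (toy model and data, not Lit-LLaMA).
import torch
from lightning.fabric import Fabric

fabric = Fabric(
    accelerator="auto",
    devices=1,
    precision="bf16-mixed" if torch.cuda.is_available() else "32-true",
)
fabric.launch()

model = torch.nn.Linear(128, 2)  # placeholder for the real (much larger) model
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
model, optimizer = fabric.setup(model, optimizer)  # handles device placement and precision

for step in range(10):
    x = torch.randn(32, 128, device=fabric.device)
    y = torch.randint(0, 2, (32,), device=fabric.device)
    loss = torch.nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    fabric.backward(loss)  # replaces loss.backward()
    optimizer.step()
```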

5. Given how good AI is at coding, is learning to code still worth it?

In this Twitter thread, Amjad Masad explains the basics of building a dream MVP. He also addresses the question of whether AI will replace developers by encouraging people to engage in coding for 100 days. According to him, learning to code has become more valuable with the help of AI, with a predicted 10x ROI.

Papers & Repositories

1. OpenAssistant

OpenAssistant is a chatbot designed to understand tasks, interact with third-party systems, and retrieve information dynamically to accomplish them. It is a project aimed at providing everyone with access to a high-quality chat-based large language model.

2. Dolly 2 released

Dolly 2 is a large language model developed by Databricks, trained on their Machine Learning Platform. It has 12 billion parameters and is a causal language model based on EleutherAI’s Pythia-12b. Dolly 2 was fine-tuned using a ~15K record instruction corpus that was created by Databricks employees and released under a permissive license.
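As a quick illustration, the released model can be loaded through the Hugging Face transformers pipeline. The snippet below is a sketch that assumes the databricks/dolly-v2-12b checkpoint on the Hugging Face Hub and a GPU with enough memory for a 12B model.

```python
# Sketch: loading Dolly 2 from the Hugging Face Hub (assumes the
# databricks/dolly-v2-12b checkpoint and sufficient GPU memory).
import torch
from transformers import pipeline

generate = pipeline(
    model="databricks/dolly-v2-12b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # Dolly ships a custom instruction-following pipeline
    device_map="auto",       # requires the accelerate package
)

print(generate("Explain instruction tuning in two sentences."))
```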

3. Teaching Large Language Models to Self-Debug

This study introduces Self-Debugging, a method that teaches a large language model to debug its predicted program via few-shot demonstrations. The model performs rubber duck debugging, identifying its mistakes by explaining the generated code in natural language, without receiving any feedback on code correctness or error messages.
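The core idea is easy to sketch: generate code, have the model explain and critique it, then regenerate. The outline below is illustrative only, with a hypothetical `llm` text-in/text-out function standing in for the actual model; it is not the paper's implementation.

```python
# Illustrative outline of a self-debugging loop. `llm` is a hypothetical
# text-in/text-out function, not the paper's actual code.
def self_debug(problem: str, llm, max_rounds: int = 3) -> str:
    code = llm(f"Write a Python function for this task:\n{problem}")
    for _ in range(max_rounds):
        # "Rubber duck" step: the model explains its own code line by line and
        # judges whether it solves the task, with no unit tests or error messages.
        critique = llm(
            f"Task:\n{problem}\n\nCode:\n{code}\n\n"
            "Explain this code line by line and state whether it solves the task."
        )
        if "no issues" in critique.lower():
            break  # the model believes its code is correct
        code = llm(
            f"Task:\n{problem}\n\nPrevious code:\n{code}\n\nYour review:\n{critique}\n\n"
            "Write a corrected version of the code."
        )
    return code
```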

4. Instruction Tuning with GPT-4

This paper introduces an approach that uses GPT-4 to generate instruction-following data for LLM fine-tuning. Early experiments on instruction-tuned LLaMA models indicate that the 52K English and Chinese instruction-following examples produced by GPT-4 lead to better zero-shot performance on new tasks than data generated by previous models.
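For a sense of what such data looks like, here is an illustrative (made-up) record in the Alpaca-style instruction/input/output format that this line of LLaMA fine-tuning work builds on, along with one common way a training prompt is assembled from it.

```python
# Illustrative (made-up) instruction-following record; the "output" field is
# what the teacher model (here, GPT-4) would generate.
record = {
    "instruction": "Summarize the following paragraph in one sentence.",
    "input": "Transformers process whole sequences in parallel using attention ...",
    "output": "Transformers rely on attention to process entire sequences at once.",
}

# One common way to assemble a fine-tuning prompt from the record:
prompt = (
    "Below is an instruction that describes a task, paired with an input.\n\n"
    f"### Instruction:\n{record['instruction']}\n\n"
    f"### Input:\n{record['input']}\n\n"
    "### Response:\n"
)
print(prompt + record["output"])
```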

5. Consistency Models

This paper proposes consistency models, a new family of generative models that achieve high sample quality without adversarial training. They support fast one-step generation, few-step sampling, and zero-shot data editing. Consistency models can be trained either by distilling pre-trained diffusion models or as standalone generative models, and they outperform existing distillation techniques for diffusion models in one- and few-step generation.
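At the heart of the paper is a self-consistency condition: a consistency function maps any point on a diffusion (probability-flow ODE) trajectory back to the trajectory's origin, which is what makes one-step generation possible. Sketched in the paper's notation:

```latex
% Self-consistency of f along a trajectory {x_t}, t in [epsilon, T],
% anchored by the boundary condition at the smallest time step epsilon.
f(x_t, t) = f(x_{t'}, t') \quad \forall\, t, t' \in [\epsilon, T],
\qquad f(x_\epsilon, \epsilon) = x_\epsilon .
```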

Enjoy these papers and news summaries? Get a daily recap in your inbox!

The Learn AI Together Community section!

Upcoming Community Events 

The Learn AI Together Discord community hosts weekly AI seminars to help the community learn from industry experts, ask questions, and get a deeper insight into the latest research in AI. Join us for free, interactive video sessions hosted live on Discord weekly by attending our upcoming events.

1. NN Arch Seminar: A (...) Logic Gate Convolutional NN Architecture from Truth Tables

AdriBen will be presenting his paper "A Scalable, Interpretable, Verifiable & Differentiable Logic Gate Convolutional Neural Network Architecture From Truth Tables" at the Neural Network Architecture Seminar. The presentation will be streamed live from Asia, which may result in an unusual time for some viewers. The seminar will be recorded, so even if you can't attend live, you can still access the content later. Join the seminar here!

Date & Time: 25th April, 1:00 pm EST

Add our Google calendar to see all our free AI events! 

Meme of the week!


Meme shared by Rucha#8062

Featured Community Post from the Discord

Creativity will remain uniquely human, but as AI moves further into the creative realms, how will it change the way we write? Rucha#8062 is conducting a workshop titled "Creative Writing in the Time of AI: Experiments with AI as your Intern". The workshop will explore the aspects of the writing process that could potentially be delegated to AI, and those that should remain solely within the realm of human expression. Through this workshop, participants will gain greater clarity and understanding of how to find their own voice, while leveraging the tools and resources offered by AI. Check it out here and support a fellow community member! Share your thoughts on the topic by joining the discussion here.

AI poll of the week!


Join the discussion on Discord.

TAI Curated section

Article of the week

Building and Deploying a GAN Streamlit Web App on Heroku [Part 1] by Youssef Hosni

Generative Adversarial Networks (GANs) are deep learning architectures that have gained popularity for their ability to generate realistic new data. However, building a GAN model is only the first step; deploying it as a user-friendly web application is a separate challenge. This article discusses the background and problem statement around GANs, then covers setting up the working environment, loading pre-trained GAN models and images, and building a Streamlit web application.
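To give a flavor of what the article builds toward, here is a minimal Streamlit sketch that loads a generator and displays one sample. The `generator.pt` file, latent size, and TorchScript loading are hypothetical placeholders, not the article's actual code.

```python
# Minimal Streamlit sketch: sample from a (hypothetical) pre-trained GAN
# generator and display the result. The file name and latent size are
# placeholders, not the article's actual code.
import streamlit as st
import torch

@st.cache_resource  # load the generator once per session
def load_generator(path: str = "generator.pt"):
    model = torch.jit.load(path, map_location="cpu")
    model.eval()
    return model

st.title("GAN image generator")

if st.button("Generate image"):
    generator = load_generator()
    z = torch.randn(1, 100)            # latent noise vector
    with torch.no_grad():
        img = generator(z).squeeze(0)  # e.g. (C, H, W) in [-1, 1]
    img = (img.clamp(-1, 1) + 1) / 2   # rescale to [0, 1] for display
    st.image(img.permute(1, 2, 0).numpy(), caption="Generated sample")
```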

Our must-read articles

Computer Vision 101: Image Restoration by Gabriele Mattioli

SAM from Meta AI — The chatGPT Moment for Computer Vision AI by Puneet Jindal

If you are interested in publishing with Towards AI, check our guidelines and sign up. We will publish your work to our network if it meets our editorial policies and standards.

Job offers

Senior Software Engineer, Applied Machine Learning @SoundHound Inc. (Remote)

Senior Python Backend Engineer @Chattermill (Remote)

Manager, Data Science @Angi (Remote)

Data Scientist (3-5 years experience) @Datalab USA (Broomfield, USA)

Staff Backend Engineer @Fiddler AI (Bangalore, India/Hybrid)

Senior Machine Learning Engineer - Computer Vision @BenchSci (Remote) 

Interested in sharing a job opportunity here? Contact sponsors@towardsai.net.

If you are preparing your next machine learning interview, don’t hesitate to check out our leading interview preparation website, confetti!
