Building a Flutter Chat App with Google Generative AI (Gemini)
Introduction
AI-powered applications are changing the way we interact with technology, and one compelling use case is integrating generative AI into chat applications. In this article, I will walk you through how I built a Flutter chat app powered by Google Generative AI.
Why Use Generative AI in a Chat App?
Generative AI, like Google’s Gemini models, is a cutting-edge technology that can understand and generate human-like text. By integrating this AI into a chat app, we can create more engaging, interactive, and dynamic conversations. The AI can handle a wide range of prompts, providing users with intelligent and contextually relevant responses.
The Google AI Dart SDK enables developers to use Google’s state-of-the-art generative AI models (like Gemini) to build AI-powered features and applications. This SDK supports use cases like:
- Generating text from text-only prompts
- Generating text from text-and-image (multimodal) prompts
- Building multi-turn conversations (chat)
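To get a feel for the API before wiring it into a chat UI, here is a minimal standalone sketch (my own example, not part of the app; you would need a valid API key):

import 'package:google_generative_ai/google_generative_ai.dart';

Future<void> main() async {
  const apiKey = 'YOUR_API_KEY'; // Placeholder; never hardcode a real key.
  final model = GenerativeModel(model: 'gemini-1.5-flash-latest', apiKey: apiKey);

  // Send a single text prompt and print the generated reply.
  final response = await model.generateContent([Content.text('Hello, Gemini!')]);
  print(response.text);
}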
Key Features of the App
- A polished chat interface built with the flutter_chat_ui package
- AI-generated replies powered by Gemini via the google_generative_ai package
- Image and file attachments using image_picker and file_picker
- Link previews for URLs shared in the chat
Prerequisites to build this App
To use the google_generative_ai package, you need a Gemini API key, which you can create in Google AI Studio.
It’s recommended to use the Google AI SDK for Dart (Flutter) to call the Gemini API directly from your app only for prototyping. If you plan to enable billing, it’s strongly advised to make these API calls server-side so your API key stays secure. Embedding the key directly in a mobile or web app, or fetching it remotely at runtime, risks exposing it to malicious actors.
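One small improvement even while prototyping: keep the key out of your source code by injecting it at build time with --dart-define. A sketch (the variable name GEMINI_API_KEY is my own choice):

// Run with: flutter run --dart-define=GEMINI_API_KEY=your_key_here
const apiKey = String.fromEnvironment('GEMINI_API_KEY');

Note that this only keeps the key out of version control; it can still be extracted from a compiled app, which is why server-side calls remain the safe option for production.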
Setting Up the Chat Interface
The core of the app is the chat interface. I used the Flutter package flutter_chat_ui, so I didn’t have to build a single UI component myself. This package provides everything needed to create a chat UI, including message bubbles, avatars, and more. Below is a snippet of how the chat interface is initialized:
@override
Widget build(BuildContext context) => Scaffold(
      body: Chat(
        messages: _messages,
        onAttachmentPressed: _handleAttachmentPressed,
        onMessageTap: _handleMessageTap,
        onPreviewDataFetched: _handlePreviewDataFetched,
        onSendPressed: _handleSendPressed,
        showUserAvatars: true,
        showUserNames: true,
        user: _user,
      ),
    );
In this setup, _messages is a list that holds all the messages exchanged in the chat. The onSendPressed callback is particularly important, as it triggers the AI to generate a response to the user’s message.
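For context, the _addMessage helper (shown again in the full listing below) prepends each new message inside setState, since the Chat widget expects the list in reverse chronological order:

void _addMessage(types.Message message) {
  setState(() {
    // Index 0 is the newest message in flutter_chat_ui.
    _messages.insert(0, message);
  });
}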
Integrating Google Generative AI
Here’s a simplified version of how the AI response is handled in the app:
void _handleSendPressed(types.PartialText message) {
  // Wrap the raw input in a TextMessage authored by the local user.
  final textMessage = types.TextMessage(
    author: _user,
    createdAt: DateTime.now().millisecondsSinceEpoch,
    id: const Uuid().v4(),
    text: message.text,
  );

  _addMessage(textMessage);
  // Ask Gemini to reply to what the user just typed.
  _generateContent(message.text);
}

Future<void> _generateContent(String prompt) async {
  const apiKey = "YOUR_API_KEY"; // For prototyping only; see the note above.
  final model = GenerativeModel(model: 'gemini-1.5-flash-latest', apiKey: apiKey);
  final content = [Content.text(prompt)];

  try {
    final response = await model.generateContent(content);
    // Show the model's reply as a message from the "AI" user.
    final replyMessage = types.TextMessage(
      author: _user2,
      createdAt: DateTime.now().millisecondsSinceEpoch,
      id: const Uuid().v4(),
      text: response.text ?? "",
    );
    _addMessage(replyMessage);
  } catch (e) {
    print('Error generating content: $e');
  }
}
In this code, the user’s message is first added to the chat, and the AI model is then asked to generate a reply, which is displayed as a new message in the chat.
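One possible refinement: for long replies, the SDK also exposes a streaming variant, so the answer can appear as it is generated rather than all at once. A rough sketch adapting _generateContent (the chunk-accumulation logic here is my own, not from the app):

Future<void> _generateContentStreamed(String prompt) async {
  const apiKey = "YOUR_API_KEY";
  final model = GenerativeModel(model: 'gemini-1.5-flash-latest', apiKey: apiKey);
  final buffer = StringBuffer();

  // generateContentStream yields partial responses as the model produces them.
  await for (final chunk in model.generateContentStream([Content.text(prompt)])) {
    buffer.write(chunk.text ?? '');
    // Update the corresponding message in _messages with buffer.toString()
    // here to show the reply growing in real time.
  }
}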
Below is the full code for the app. You can also visit my GitHub repo to download it.
import 'dart:convert';
import 'dart:io';
import 'package:file_picker/file_picker.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart' show rootBundle;
import 'package:flutter_chat_types/flutter_chat_types.dart' as types;
import 'package:flutter_chat_ui/flutter_chat_ui.dart';
import 'package:google_generative_ai/google_generative_ai.dart';
import 'package:http/http.dart' as http;
import 'package:image_picker/image_picker.dart';
import 'package:intl/date_symbol_data_local.dart';
import 'package:mime/mime.dart';
import 'package:open_filex/open_filex.dart';
import 'package:path_provider/path_provider.dart';
import 'package:uuid/uuid.dart';
void main() {
  initializeDateFormatting().then((_) => runApp(const MyApp()));
}

class MyApp extends StatelessWidget {
  const MyApp({super.key});

  @override
  Widget build(BuildContext context) => const MaterialApp(
        home: Directionality(
          textDirection: TextDirection.ltr,
          child: ChatPage(),
        ),
      );
}

class ChatPage extends StatefulWidget {
  const ChatPage({super.key});

  @override
  State<ChatPage> createState() => _ChatPageState();
}
class _ChatPageState extends State<ChatPage> {
  List<types.Message> _messages = [];
  // The local (human) user.
  final _user = const types.User(
    id: '82091008-a484-4a89-ae75-a22bf8d6f3ac',
  );
  // The "AI" user that authors Gemini's replies.
  final _user2 = const types.User(id: '82091008-a484-4a89-ae75-a22bf8d6f3ab');

  @override
  void initState() {
    super.initState();
    _loadMessages();
  }

  void _addMessage(types.Message message) {
    setState(() {
      // flutter_chat_ui expects the newest message at index 0.
      _messages.insert(0, message);
    });
  }
  void _handleAttachmentPressed() {
    showModalBottomSheet<void>(
      context: context,
      builder: (BuildContext context) => SafeArea(
        child: SizedBox(
          height: 144,
          child: Column(
            crossAxisAlignment: CrossAxisAlignment.stretch,
            children: <Widget>[
              TextButton(
                onPressed: () {
                  Navigator.pop(context);
                  _handleImageSelection();
                },
                child: const Align(
                  alignment: AlignmentDirectional.centerStart,
                  child: Text('Photo'),
                ),
              ),
              TextButton(
                onPressed: () {
                  Navigator.pop(context);
                  _handleFileSelection();
                },
                child: const Align(
                  alignment: AlignmentDirectional.centerStart,
                  child: Text('File'),
                ),
              ),
              TextButton(
                onPressed: () => Navigator.pop(context),
                child: const Align(
                  alignment: AlignmentDirectional.centerStart,
                  child: Text('Cancel'),
                ),
              ),
            ],
          ),
        ),
      ),
    );
  }
  void _handleFileSelection() async {
    final result = await FilePicker.platform.pickFiles(
      type: FileType.any,
    );

    if (result != null && result.files.single.path != null) {
      final message = types.FileMessage(
        author: _user,
        createdAt: DateTime.now().millisecondsSinceEpoch,
        id: const Uuid().v4(),
        mimeType: lookupMimeType(result.files.single.path!),
        name: result.files.single.name,
        size: result.files.single.size,
        uri: result.files.single.path!,
      );

      _addMessage(message);
    }
  }
  void _handleImageSelection() async {
    final result = await ImagePicker().pickImage(
      imageQuality: 70,
      maxWidth: 1440,
      source: ImageSource.gallery,
    );

    if (result != null) {
      final bytes = await result.readAsBytes();
      final image = await decodeImageFromList(bytes);

      final message = types.ImageMessage(
        author: _user,
        createdAt: DateTime.now().millisecondsSinceEpoch,
        height: image.height.toDouble(),
        id: const Uuid().v4(),
        name: result.name,
        size: bytes.length,
        uri: result.path,
        width: image.width.toDouble(),
      );

      _addMessage(message);
    }
  }
  void _handleMessageTap(BuildContext _, types.Message message) async {
    if (message is types.FileMessage) {
      var localPath = message.uri;

      // Remote files are downloaded to the documents directory before opening.
      if (message.uri.startsWith('http')) {
        try {
          final index =
              _messages.indexWhere((element) => element.id == message.id);
          final updatedMessage =
              (_messages[index] as types.FileMessage).copyWith(
            isLoading: true,
          );

          setState(() {
            _messages[index] = updatedMessage;
          });

          final client = http.Client();
          final request = await client.get(Uri.parse(message.uri));
          final bytes = request.bodyBytes;
          final documentsDir = (await getApplicationDocumentsDirectory()).path;
          localPath = '$documentsDir/${message.name}';

          if (!File(localPath).existsSync()) {
            final file = File(localPath);
            await file.writeAsBytes(bytes);
          }
        } finally {
          final index =
              _messages.indexWhere((element) => element.id == message.id);
          final updatedMessage =
              (_messages[index] as types.FileMessage).copyWith(
            isLoading: null,
          );

          setState(() {
            _messages[index] = updatedMessage;
          });
        }
      }

      await OpenFilex.open(localPath);
    }
  }
  void _handlePreviewDataFetched(
    types.TextMessage message,
    types.PreviewData previewData,
  ) {
    final index = _messages.indexWhere((element) => element.id == message.id);
    final updatedMessage = (_messages[index] as types.TextMessage).copyWith(
      previewData: previewData,
    );

    setState(() {
      _messages[index] = updatedMessage;
    });
  }
  void _handleSendPressed(types.PartialText message) {
    final textMessage = types.TextMessage(
      author: _user,
      createdAt: DateTime.now().millisecondsSinceEpoch,
      id: const Uuid().v4(),
      text: message.text,
    );

    _addMessage(textMessage);
    _generateContent(message.text);
  }

  Future<void> _generateContent(String prompt) async {
    const apiKey = "YOUR_API_KEY"; // Replace with your API key (prototyping only).
    final model = GenerativeModel(model: 'gemini-1.5-flash-latest', apiKey: apiKey);
    final content = [Content.text(prompt)];

    try {
      final response = await model.generateContent(content);
      final replyMessage = types.TextMessage(
        author: _user2,
        createdAt: DateTime.now().millisecondsSinceEpoch,
        id: const Uuid().v4(),
        text: response.text ?? "",
      );
      _addMessage(replyMessage);
    } catch (e) {
      print('Error generating content: $e');
    }
  }
  void _loadMessages() async {
    // Seed the chat with demo messages bundled as an asset.
    final response = await rootBundle.loadString('assets/messages.json');
    final messages = (jsonDecode(response) as List)
        .map((e) => types.Message.fromJson(e as Map<String, dynamic>))
        .toList();

    setState(() {
      _messages = messages;
    });
  }
  @override
  Widget build(BuildContext context) => Scaffold(
        body: Chat(
          messages: _messages,
          onAttachmentPressed: _handleAttachmentPressed,
          onMessageTap: _handleMessageTap,
          onPreviewDataFetched: _handlePreviewDataFetched,
          onSendPressed: _handleSendPressed,
          showUserAvatars: true,
          showUserNames: true,
          user: _user,
        ),
      );
}
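For completeness, the imports above map to these pubspec.yaml dependencies (a sketch; version constraints are omitted here, so pick current ones from pub.dev). The messages.json seed file also has to be registered as an asset:

dependencies:
  flutter:
    sdk: flutter
  file_picker:
  flutter_chat_types:
  flutter_chat_ui:
  google_generative_ai:
  http:
  image_picker:
  intl:
  mime:
  open_filex:
  path_provider:
  uuid:

flutter:
  assets:
    - assets/messages.json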
Conclusion
Integrating Google Generative AI into a Flutter chat app opens up a world of possibilities for creating more interactive and intelligent applications. This project is just a glimpse of what can be achieved when AI is combined with modern app development frameworks like Flutter.