How to Use WebRTC in Flutter

Part 1: Implementing Real-Time Audio/Video Communication with WebRTC in Flutter

With the rapid growth of real-time communication technology, WebRTC has become a mainstream choice for video calling and live streaming. In Flutter, you can just as easily call into WebRTC to build cross-platform real-time audio/video communication.
### 📥 1. Add the Dependency

First, open `pubspec.yaml` and add the flutter_webrtc plugin:
```yaml
dependencies:
  flutter_webrtc: ^0.9.48
```
Then run the following command to install it:
```shell
flutter pub get
```
### ⚙️ 2. Configure Platform Permissions
- **Android** (`android/app/src/main/AndroidManifest.xml`):
```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />
```
- **iOS** (`ios/Runner/Info.plist`):
```xml
<key>NSMicrophoneUsageDescription</key>
<string>Allow this app to access the microphone for audio/video calls</string>
<key>NSCameraUsageDescription</key>
<string>Allow this app to access the camera for audio/video calls</string>
```
### 🚀 3. WebRTC Usage Example

The following uses a point-to-point video call as an example to show how Flutter drives WebRTC:

#### 1. Import the library
```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';
```
#### 2. Create a media stream (MediaStream)
```dart
MediaStream? _localStream;

Future<MediaStream> getUserMedia() async {
  final Map<String, dynamic> mediaConstraints = {
    'audio': true,
    'video': {
      'facingMode': 'user', // front-facing camera
    }
  };
  _localStream = await navigator.mediaDevices.getUserMedia(mediaConstraints);
  return _localStream!;
}
```
#### 3. Initialize the PeerConnection and exchange SDP (simplified)

To establish a peer-to-peer connection, each endpoint creates a PeerConnection and the two sides exchange SDP (Session Description Protocol) information. In practice a server is usually needed to relay these messages; a simplified example follows:
```dart
RTCPeerConnection? _peerConnection;

// Named _initPeerConnection so it does not shadow the plugin's
// top-level createPeerConnection() function called below.
Future<void> _initPeerConnection() async {
  final Map<String, dynamic> config = {
    "iceServers": [
      {"urls": "stun:stun.l.google.com:19302"},
    ]
  };
  _peerConnection = await createPeerConnection(config);
  _localStream!.getTracks().forEach((track) {
    _peerConnection!.addTrack(track, _localStream!);
  });
  _peerConnection!.onIceCandidate = (candidate) {
    // Send the candidate to the remote client
  };
  _peerConnection!.onTrack = (RTCTrackEvent event) {
    // The remote stream is available here
    MediaStream remoteStream = event.streams[0];
  };
  RTCSessionDescription offer = await _peerConnection!.createOffer();
  await _peerConnection!.setLocalDescription(offer);
  // Send offer.sdp and offer.type to the remote peer
}
```
When the remote peer receives the offer, it responds:
```dart
Future<void> answerOffer(String remoteSdp) async {
  await _peerConnection!.setRemoteDescription(
      RTCSessionDescription(remoteSdp, 'offer'));
  RTCSessionDescription answer = await _peerConnection!.createAnswer();
  await _peerConnection!.setLocalDescription(answer);
  // Send answer.sdp and answer.type back to the caller
}
```
#### 4. Display the video streams with RTCVideoView

Place two video containers in your Flutter widget tree:
```dart
// The renderer must be initialized (an async call) before use,
// e.g. in initState():
final RTCVideoRenderer _localRenderer = RTCVideoRenderer();
await _localRenderer.initialize();
_localRenderer.srcObject = _localStream;

// Then, in build():
RTCVideoView(
  _localRenderer,
  mirror: true,
);
// Handle the remote video the same way, setting srcObject to the remote stream
```
### ✅ 4. Notes

- In production you need a signaling server (e.g. over WebSocket, or via Firebase) to exchange signaling messages.
- The STUN entry in the ICE servers list provides NAT traversal; to get through restrictive NATs and firewalls reliably, you will likely also need a TURN server (e.g. coturn).
- For multi-party video calls, an SFU or MCU server architecture is recommended.
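As a concrete illustration of the TURN fallback mentioned above, here is a minimal sketch of an ICE server configuration; the same shape works for the browser's `RTCPeerConnection` and, written as a Dart map, for flutter_webrtc. The TURN URL, username, and credential are placeholders for your own coturn deployment:

```javascript
// ICE server list with a STUN entry plus a TURN fallback.
// turn.example.com and the credentials below are placeholders --
// substitute the values from your own coturn (or other TURN) server.
const configuration = {
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },
    {
      urls: 'turn:turn.example.com:3478',
      username: 'demo-user',       // placeholder
      credential: 'demo-password', // placeholder
    },
  ],
};
```

A peer connection created with this configuration tries direct and STUN-derived candidates first and falls back to relaying media through TURN when no direct path can be established.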
### 📖 Summary

With the flutter_webrtc package we can quickly build real-time audio/video communication apps. This part covered the basic concepts and key code needed to call WebRTC from Flutter, so you can get a working audio/video app up and running with little effort.
Part 2: Making Flutter WebRTC Interoperate with a Browser WebRTC Demo

For a Flutter app to communicate with a WebRTC demo running in a browser, the keys are handling the signaling exchange correctly and ensuring cross-platform compatibility. The implementation is covered step by step below.
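Before looking at the server, it helps to pin down the message shapes the two clients must agree on. The sketch below is illustrative (the placeholder strings stand in for real values), but the field names mirror the Socket.IO events used throughout this part; note that every message carries a `target`, the remote peer's socket id, so the server knows where to forward it:

```javascript
// Shapes of the three signaling messages relayed by the server.
// '<...>' strings are placeholders for real SDP/candidate values.
const offerMsg = { target: '<peer socket id>', type: 'offer', sdp: '<sdp text>' };
const answerMsg = { target: '<peer socket id>', type: 'answer', sdp: '<sdp text>' };
const candidateMsg = {
  target: '<peer socket id>',
  candidate: { candidate: '<candidate line>', sdpMid: '0', sdpMLineIndex: 0 },
};
```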
### 1. Setting Up the Signaling Server

For the two platforms to talk to each other, they first need a shared signaling server. Below is a simple one based on Node.js and Socket.IO:
```javascript
const express = require('express');
const http = require('http');
const socketIo = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = socketIo(server);

// Map of connected users: socket id -> room id
let users = {};

io.on('connection', (socket) => {
  console.log('user connected:', socket.id);

  // A user joins a room
  socket.on('join', (roomId) => {
    socket.join(roomId);
    users[socket.id] = roomId;
    // Notify the other users in the room
    socket.to(roomId).emit('user-joined', socket.id);
  });

  // Relay an offer
  socket.on('offer', (data) => {
    socket.to(data.target).emit('offer', {
      sdp: data.sdp,
      type: data.type,
      from: socket.id
    });
  });

  // Relay an answer
  socket.on('answer', (data) => {
    socket.to(data.target).emit('answer', {
      sdp: data.sdp,
      type: data.type,
      from: socket.id
    });
  });

  // Relay an ICE candidate
  socket.on('ice-candidate', (data) => {
    socket.to(data.target).emit('ice-candidate', {
      candidate: data.candidate,
      from: socket.id
    });
  });

  // Handle disconnects
  socket.on('disconnect', () => {
    const roomId = users[socket.id];
    if (roomId) {
      socket.to(roomId).emit('user-left', socket.id);
      delete users[socket.id];
    }
  });
});

server.listen(3000, () => {
  console.log('signaling server listening on http://localhost:3000');
});
```
### 2. The Flutter Side

Building on Part 1, adjust the Flutter WebRTC code to communicate with the browser (this additionally requires the socket_io_client package in `pubspec.yaml`):
```dart
import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';
import 'package:socket_io_client/socket_io_client.dart' as IO;

class WebRTCPage extends StatefulWidget {
  @override
  _WebRTCPageState createState() => _WebRTCPageState();
}

class _WebRTCPageState extends State<WebRTCPage> {
  final RTCVideoRenderer _localRenderer = RTCVideoRenderer();
  final RTCVideoRenderer _remoteRenderer = RTCVideoRenderer();
  MediaStream? _localStream;
  RTCPeerConnection? _peerConnection;
  IO.Socket? _socket;
  String roomId = "test_room";
  // Socket id of the remote peer, learned from signaling messages,
  // so ICE candidates can be routed to the right client.
  String? _remoteId;

  @override
  void initState() {
    super.initState();
    initRenderers();
    _connectSocket();
  }

  // Initialize the video renderers
  Future<void> initRenderers() async {
    await _localRenderer.initialize();
    await _remoteRenderer.initialize();
  }

  // Connect to the signaling server
  void _connectSocket() {
    _socket = IO.io('http://your-signaling-server:3000', <String, dynamic>{
      'transports': ['websocket'],
      'autoConnect': true,
    });

    _socket!.on('connect', (_) {
      print('connected to signaling server');
      _socket!.emit('join', roomId);
      _initWebRTC();
    });

    _socket!.on('user-joined', (id) {
      print('user joined: $id');
      _remoteId = id;
      _createOffer(id);
    });

    _socket!.on('offer', (data) async {
      print('received offer');
      await _handleOffer(data);
    });

    _socket!.on('answer', (data) async {
      print('received answer');
      await _handleAnswer(data);
    });

    _socket!.on('ice-candidate', (data) async {
      print('received ICE candidate');
      await _addIceCandidate(data);
    });
  }

  // Initialize WebRTC
  Future<void> _initWebRTC() async {
    // Get the local media stream
    final Map<String, dynamic> mediaConstraints = {
      'audio': true,
      'video': {
        'facingMode': 'user',
      }
    };
    _localStream = await navigator.mediaDevices.getUserMedia(mediaConstraints);
    setState(() {
      _localRenderer.srcObject = _localStream;
    });

    // PeerConnection configuration
    final Map<String, dynamic> config = {
      "iceServers": [
        {"urls": "stun:stun.l.google.com:19302"},
        // Add a TURN server here to improve connection success rates
      ]
    };

    // Constraints declaring that we want to receive audio and video
    final Map<String, dynamic> offerSdpConstraints = {
      "mandatory": {
        "OfferToReceiveAudio": true,
        "OfferToReceiveVideo": true,
      },
      "optional": [],
    };

    // Create the PeerConnection
    _peerConnection = await createPeerConnection(config, offerSdpConstraints);

    // Add the local media tracks
    _localStream!.getTracks().forEach((track) {
      _peerConnection!.addTrack(track, _localStream!);
    });

    // Listen for the remote stream
    _peerConnection!.onTrack = (RTCTrackEvent event) {
      print("received remote media stream");
      if (event.streams.isNotEmpty) {
        setState(() {
          _remoteRenderer.srcObject = event.streams[0];
        });
      }
    };

    // Relay locally gathered ICE candidates to the remote peer
    _peerConnection!.onIceCandidate = (RTCIceCandidate candidate) {
      if (_socket != null && _remoteId != null && candidate.candidate != null) {
        _socket!.emit('ice-candidate', {
          'target': _remoteId, // the remote peer's socket id
          'candidate': candidate.toMap(),
        });
      }
    };
  }

  // Create an offer
  Future<void> _createOffer(String targetId) async {
    // Make sure the PeerConnection exists before creating an offer
    if (_peerConnection == null) {
      await _initWebRTC();
    }
    RTCSessionDescription description = await _peerConnection!.createOffer();
    await _peerConnection!.setLocalDescription(description);
    // Send the offer to the remote peer
    _socket!.emit('offer', {
      'target': targetId,
      'type': description.type,
      'sdp': description.sdp,
    });
  }

  // Handle an incoming offer
  Future<void> _handleOffer(dynamic data) async {
    // Remember who the offer came from so ICE candidates can be routed back
    _remoteId = data['from'];
    // Make sure the PeerConnection exists
    if (_peerConnection == null) {
      await _initWebRTC();
    }
    // Set the remote description
    await _peerConnection!.setRemoteDescription(
      RTCSessionDescription(data['sdp'], data['type']),
    );
    // Create and send the answer
    RTCSessionDescription answer = await _peerConnection!.createAnswer();
    await _peerConnection!.setLocalDescription(answer);
    _socket!.emit('answer', {
      'target': data['from'],
      'type': answer.type,
      'sdp': answer.sdp,
    });
  }

  // Handle an incoming answer
  Future<void> _handleAnswer(dynamic data) async {
    await _peerConnection!.setRemoteDescription(
      RTCSessionDescription(data['sdp'], data['type']),
    );
  }

  // Add a received ICE candidate
  Future<void> _addIceCandidate(dynamic data) async {
    if (_peerConnection != null) {
      await _peerConnection!.addCandidate(
        RTCIceCandidate(
          data['candidate']['candidate'],
          data['candidate']['sdpMid'],
          data['candidate']['sdpMLineIndex'],
        ),
      );
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Flutter WebRTC')),
      body: Column(
        children: [
          Expanded(
            child: Container(
              margin: EdgeInsets.all(8.0),
              decoration: BoxDecoration(color: Colors.black),
              child: RTCVideoView(_localRenderer, mirror: true),
            ),
          ),
          Expanded(
            child: Container(
              margin: EdgeInsets.all(8.0),
              decoration: BoxDecoration(color: Colors.black),
              child: RTCVideoView(_remoteRenderer),
            ),
          ),
        ],
      ),
    );
  }

  @override
  void dispose() {
    _localRenderer.dispose();
    _remoteRenderer.dispose();
    _localStream?.getTracks().forEach((track) => track.stop());
    _peerConnection?.close();
    _socket?.disconnect();
    super.dispose();
  }
}
```
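One detail both this Flutter client and the browser demo in the next section gloss over: ICE candidates can arrive over the signaling channel before `setRemoteDescription` has completed, and adding them at that point can fail. A common remedy is to queue early candidates and flush them once the remote description is set. Here is a minimal sketch in plain JavaScript; the helper names are illustrative, not from any library, and `addCandidateToPeer` stands in for `peerConnection.addIceCandidate`:

```javascript
// Queue ICE candidates that arrive before the remote description is set,
// then flush them in arrival order once it is.
const pendingCandidates = [];
let remoteDescriptionSet = false;

function onRemoteCandidate(candidate, addCandidateToPeer) {
  if (remoteDescriptionSet) {
    addCandidateToPeer(candidate);       // safe to add immediately
  } else {
    pendingCandidates.push(candidate);   // too early -- hold it
  }
}

function onRemoteDescriptionSet(addCandidateToPeer) {
  remoteDescriptionSet = true;
  // Drain everything that arrived early, preserving order.
  while (pendingCandidates.length > 0) {
    addCandidateToPeer(pendingCandidates.shift());
  }
}
```

Call `onRemoteCandidate` from the `ice-candidate` handler and `onRemoteDescriptionSet` right after `setRemoteDescription` resolves; the same pattern translates directly to the Dart side.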
### 3. The Browser-Side WebRTC Demo

On the browser side, the following JavaScript communicates with the Flutter client:
```html
<!DOCTYPE html>
<html>
<head>
  <title>WebRTC Browser Demo</title>
  <style>
    .videos {
      display: flex;
      flex-wrap: wrap;
    }
    video {
      width: 45%;
      margin: 10px;
      background: #000;
    }
  </style>
</head>
<body>
  <h1>WebRTC Browser Demo</h1>
  <div class="videos">
    <video id="localVideo" autoplay muted playsinline></video>
    <video id="remoteVideo" autoplay playsinline></video>
  </div>

  <script src="https://cdn.socket.io/4.4.1/socket.io.min.js"></script>
  <script>
    const socket = io('http://your-signaling-server:3000');
    const roomId = 'test_room';
    let localStream;
    let peerConnection;
    // Socket id of the remote peer, learned from signaling messages
    let remoteId = null;

    // ICE server configuration
    const configuration = {
      iceServers: [
        { urls: 'stun:stun.l.google.com:19302' },
        // Add a TURN server here to improve connection success rates
      ]
    };

    // Connect to the signaling server
    socket.on('connect', async () => {
      console.log('connected to signaling server');
      try {
        // Get the local media stream
        localStream = await navigator.mediaDevices.getUserMedia({
          audio: true,
          video: true
        });
        document.getElementById('localVideo').srcObject = localStream;
        // Join the room
        socket.emit('join', roomId);
      } catch (err) {
        console.error('failed to get media stream:', err);
      }
    });

    // Handle a new user joining
    socket.on('user-joined', (id) => {
      console.log('user joined:', id);
      remoteId = id;
      startCall(id);
    });

    // Handle an incoming offer
    socket.on('offer', async (data) => {
      console.log('received offer');
      remoteId = data.from;
      await handleOffer(data);
    });

    // Handle an incoming answer
    socket.on('answer', async (data) => {
      console.log('received answer');
      await handleAnswer(data);
    });

    // Handle an incoming ICE candidate
    socket.on('ice-candidate', async (data) => {
      console.log('received ICE candidate');
      await addIceCandidate(data);
    });

    async function createPeerConnection() {
      try {
        peerConnection = new RTCPeerConnection(configuration);

        // Add the local media tracks
        localStream.getTracks().forEach(track => {
          peerConnection.addTrack(track, localStream);
        });

        // Receive the remote stream
        peerConnection.ontrack = (event) => {
          console.log('received remote track');
          if (event.streams && event.streams[0]) {
            document.getElementById('remoteVideo').srcObject = event.streams[0];
          }
        };

        // Relay locally gathered ICE candidates to the remote peer
        peerConnection.onicecandidate = (event) => {
          if (event.candidate && remoteId) {
            socket.emit('ice-candidate', {
              target: remoteId, // the remote peer's socket id
              candidate: event.candidate
            });
          }
        };

        return peerConnection;
      } catch (err) {
        console.error('failed to create PeerConnection:', err);
      }
    }

    async function startCall(targetId) {
      if (!peerConnection) {
        await createPeerConnection();
      }
      try {
        // Create and send the offer
        const offer = await peerConnection.createOffer();
        await peerConnection.setLocalDescription(offer);
        socket.emit('offer', {
          target: targetId,
          type: offer.type,
          sdp: offer.sdp
        });
      } catch (err) {
        console.error('failed to create offer:', err);
      }
    }

    async function handleOffer(data) {
      if (!peerConnection) {
        await createPeerConnection();
      }
      try {
        // Set the remote description
        await peerConnection.setRemoteDescription(
          new RTCSessionDescription({ type: data.type, sdp: data.sdp })
        );
        // Create and send the answer
        const answer = await peerConnection.createAnswer();
        await peerConnection.setLocalDescription(answer);
        socket.emit('answer', {
          target: data.from,
          type: answer.type,
          sdp: answer.sdp
        });
      } catch (err) {
        console.error('failed to handle offer:', err);
      }
    }

    async function handleAnswer(data) {
      try {
        await peerConnection.setRemoteDescription(
          new RTCSessionDescription({ type: data.type, sdp: data.sdp })
        );
      } catch (err) {
        console.error('failed to handle answer:', err);
      }
    }

    async function addIceCandidate(data) {
      try {
        if (peerConnection) {
          await peerConnection.addIceCandidate(new RTCIceCandidate(data.candidate));
        }
      } catch (err) {
        console.error('failed to add ICE candidate:', err);
      }
    }
  </script>
</body>
</html>
```