Real-Time Interaction and Live Streaming for Online Education: From Architecture Design to Production Practice
The core of the online-education experience is how faithfully it reproduces the interactivity of a real classroom. A good live-classroom system must not only deliver clear, smooth audio and video, but also support rich interactions such as real-time Q&A, whiteboard presentation, screen sharing, and breakout discussions. This article walks through the technical architecture and implementation of a live-streaming system for online education.
Why Live-Streaming Technology Matters
Online education is fundamentally different from ordinary live streaming:
Special Requirements of Education Scenarios
High interactivity: teaching relies on frequent teacher-student exchanges, and latency above 500 ms noticeably disrupts the conversation.
Extreme stability: a lesson runs for 45 minutes, and any stutter or interruption directly hurts teaching quality and user experience.
High functional complexity: the system must support video streaming, whiteboard annotation, document sharing, hand raising, and in-class quizzes at the same time.
Wide range of scale: from 1-on-1 tutoring to open lectures with tens of thousands of attendees, the system must scale elastically.
Core Technical Targets
| Scenario | Latency target | Concurrency | Interactivity |
|---|---|---|---|
| 1-on-1 tutoring | < 200 ms | 2 | High-frequency, two-way |
| Small class | < 400 ms | 6-20 | Multi-party interaction |
| Large class | < 1 s | 50-500 | Host + invited speakers (co-streaming) |
| Open lecture | < 3 s | 1000+ | Mostly one-way |
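These targets map fairly directly onto transport choices. Below is a minimal, illustrative sketch of how a room-creation service might pick a profile from the expected class size; the names (pickTransportProfile, TransportProfile) and thresholds are hypothetical and simply mirror the table above, not part of any specific SDK.
// transport-profile.ts (illustrative sketch only)
type RoomMode = 'one_on_one' | 'small_class' | 'large_class' | 'open_lecture';

interface TransportProfile {
  mode: RoomMode;
  maxLatencyMs: number; // target end-to-end latency
  transport: 'webrtc_p2p' | 'webrtc_sfu' | 'cdn_live';
}

// Map expected class size to a profile; thresholds mirror the table above.
function pickTransportProfile(expectedParticipants: number): TransportProfile {
  if (expectedParticipants <= 2) {
    return { mode: 'one_on_one', maxLatencyMs: 200, transport: 'webrtc_p2p' };
  }
  if (expectedParticipants <= 20) {
    return { mode: 'small_class', maxLatencyMs: 400, transport: 'webrtc_sfu' };
  }
  if (expectedParticipants <= 500) {
    return { mode: 'large_class', maxLatencyMs: 1000, transport: 'webrtc_sfu' };
  }
  return { mode: 'open_lecture', maxLatencyMs: 3000, transport: 'cdn_live' };
}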
System Architecture Design
Overall Architecture
┌─────────────────────────────────────────────────────────────────┐
│                           Client Layer                          │
│  ┌───────────┐  ┌───────────┐  ┌───────────┐  ┌───────────┐     │
│  │  Web/H5   │  │  iOS App  │  │Android App│  │  Desktop  │     │
│  │ (WebRTC)  │  │(Native SDK)│ │(Native SDK)│ │(Electron) │     │
│  └───────────┘  └───────────┘  └───────────┘  └───────────┘     │
└─────────────────────────────────────────────────────────────────┘
                                 │
                                 ▼
┌─────────────────────────────────────────────────────────────────┐
│                           Access Layer                          │
│  ┌────────────────────────────────────────────────────────┐     │
│  │                       Edge Nodes                        │     │
│  │  - Nearest-node access to cut first-hop latency         │     │
│  │  - Media relay / forwarding                             │     │
│  │  - Signaling proxy                                      │     │
│  └────────────────────────────────────────────────────────┘     │
└─────────────────────────────────────────────────────────────────┘
                                 │
                                 ▼
┌─────────────────────────────────────────────────────────────────┐
│                          Service Layer                          │
│  ┌──────────┐  ┌──────────┐  ┌──────────┐  ┌──────────┐         │
│  │Signaling │  │  Media   │  │Messaging │  │Recording │         │
│  │          │  │  (SFU)   │  │ (IM/RTM) │  │          │         │
│  └──────────┘  └──────────┘  └──────────┘  └──────────┘         │
│  ┌──────────┐  ┌──────────┐  ┌──────────┐  ┌──────────┐         │
│  │Whiteboard│  │Transcode │  │Classroom │  │Analytics │         │
│  │          │  │          │  │   Mgmt   │  │          │         │
│  └──────────┘  └──────────┘  └──────────┘  └──────────┘         │
└─────────────────────────────────────────────────────────────────┘
                                 │
                                 ▼
┌─────────────────────────────────────────────────────────────────┐
│                       Infrastructure Layer                      │
│  ┌──────────┐  ┌──────────┐  ┌──────────┐  ┌──────────┐         │
│  │  Redis   │  │  MySQL   │  │  Kafka   │  │   OSS    │         │
│  │  state   │  │ business │  │ message  │  │  object  │         │
│  │  cache   │  │   data   │  │  queue   │  │ storage  │         │
│  └──────────┘  └──────────┘  └──────────┘  └──────────┘         │
└─────────────────────────────────────────────────────────────────┘
Media Transport Architecture
Online education mainly uses the SFU (Selective Forwarding Unit) architecture:
                  ┌─────────────┐
                  │     SFU     │
                  │   server    │
                  └──────┬──────┘
                         │
       ┌─────────────────┼─────────────────┐
       │                 │                 │
       ▼                 ▼                 ▼
 ┌───────────┐     ┌───────────┐     ┌───────────┐
 │  Teacher  │     │ Student A │     │ Student B │
 │ 1 up      │     │ 1 up      │     │ N down    │
 │ N down    │     │ N down    │     │           │
 └───────────┘     └───────────┘     └───────────┘
Advantages of the SFU approach:
- The server only forwards media and does not transcode, keeping latency low
- Clients can selectively subscribe to streams, giving fine-grained bandwidth control (see the sketch below)
- It scales to large rooms (a single server can forward on the order of a thousand streams)
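To make "selective subscription" concrete, here is a small illustrative sketch. The message shape and field names are assumptions for this article, not any particular SFU's API: a student in a 20-person class subscribes to the teacher in high quality and to classmates in low quality, keeping downlink bandwidth bounded.
// selective-subscription sketch (field names are hypothetical)
interface SubscribeRequest {
  streamId: string;                  // which publisher to receive
  layer: 'high' | 'medium' | 'low';  // which simulcast layer the SFU should forward
  audioOnly?: boolean;               // drop video entirely on very weak networks
}

function buildSubscriptions(teacherId: string, studentIds: string[]): SubscribeRequest[] {
  return [
    { streamId: teacherId, layer: 'high' },
    ...studentIds.map((id) => ({ streamId: id, layer: 'low' as const })),
  ];
}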
Core Module Implementation
1. WebRTC Audio/Video Module
Media Capture and Publishing
// webrtc-client.ts
import { EventEmitter } from 'events';
// Thin wrapper over the signaling channel; module path is assumed, implementation not shown here.
import { SignalingClient } from './signaling-client';
interface MediaConfig {
video: boolean | MediaTrackConstraints;
audio: boolean | MediaTrackConstraints;
}
interface PublishOptions {
simulcast?: boolean; // enable simulcast (send multiple resolutions of the same video)
screenShare?: boolean;
}
export class WebRTCClient extends EventEmitter {
private localStream: MediaStream | null = null;
private peerConnection: RTCPeerConnection | null = null;
private signaling: SignalingClient;
constructor(config: { signalingUrl: string }) {
super();
this.signaling = new SignalingClient(config.signalingUrl);
this.setupSignalingHandlers();
}
/**
* Capture local media (camera and microphone)
*/
async captureLocalMedia(config: MediaConfig): Promise<MediaStream> {
try {
// Video constraints
const videoConstraints: MediaTrackConstraints =
typeof config.video === 'object' ? config.video : {
width: { ideal: 1280, max: 1920 },
height: { ideal: 720, max: 1080 },
frameRate: { ideal: 24, max: 30 },
facingMode: 'user',
};
// Audio constraints
const audioConstraints: MediaTrackConstraints =
typeof config.audio === 'object' ? config.audio : {
echoCancellation: true, // echo cancellation
noiseSuppression: true, // noise suppression
autoGainControl: true, // automatic gain control
sampleRate: 48000,
};
this.localStream = await navigator.mediaDevices.getUserMedia({
video: config.video ? videoConstraints : false,
audio: config.audio ? audioConstraints : false,
});
this.emit('localStreamReady', this.localStream);
return this.localStream;
} catch (error) {
this.handleMediaError(error as Error);
throw error;
}
}
/**
* Capture the screen for sharing
*/
async captureScreen(): Promise<MediaStream> {
try {
const screenStream = await navigator.mediaDevices.getDisplayMedia({
video: {
cursor: 'always',
displaySurface: 'monitor',
},
audio: true, // capture system audio where the browser supports it
});
// Detect when the user stops sharing from the browser UI
screenStream.getVideoTracks()[0].addEventListener('ended', () => {
this.emit('screenShareEnded');
});
return screenStream;
} catch (error) {
if ((error as Error).name === 'NotAllowedError') {
throw new Error('Screen sharing was cancelled by the user');
}
throw error;
}
}
/**
* Publish a media stream to the media server
*/
async publish(stream: MediaStream, options: PublishOptions = {}): Promise<void> {
// Create the PeerConnection
this.peerConnection = new RTCPeerConnection({
iceServers: [
{ urls: 'stun:stun.l.google.com:19302' },
{
urls: 'turn:your-turn-server.com:3478',
username: 'username',
credential: 'password',
},
],
// Unified Plan SDP semantics is the default in modern browsers, so no extra option is needed
});
// Add tracks. Simulcast (rid-based) layers must be declared when the
// transceiver is created; they cannot be attached afterwards via setParameters().
for (const track of stream.getTracks()) {
if (options.simulcast && track.kind === 'video') {
this.peerConnection.addTransceiver(track, {
direction: 'sendonly',
streams: [stream],
sendEncodings: this.getSimulcastEncodings(),
});
} else {
this.peerConnection.addTrack(track, stream);
}
}
// Forward ICE candidates to the signaling server
this.peerConnection.onicecandidate = (event) => {
if (event.candidate) {
this.signaling.send('ice-candidate', {
candidate: event.candidate,
});
}
};
// Track connection state changes
this.peerConnection.onconnectionstatechange = () => {
const state = this.peerConnection?.connectionState;
this.emit('connectionStateChange', state);
if (state === 'failed') {
this.handleConnectionFailed();
}
};
// Create and send the offer
const offer = await this.peerConnection.createOffer();
await this.peerConnection.setLocalDescription(offer);
this.signaling.send('publish', {
sdp: offer.sdp,
streamId: stream.id,
type: options.screenShare ? 'screen' : 'camera',
});
}
/**
* Simulcast encodings: publish high / medium / low layers of the same video
* track so the SFU can forward whichever layer fits each subscriber.
*/
private getSimulcastEncodings(): RTCRtpEncodingParameters[] {
return [
{
rid: 'high',
maxBitrate: 2000000,
maxFramerate: 30,
scaleResolutionDownBy: 1,
},
{
rid: 'medium',
maxBitrate: 500000,
maxFramerate: 24,
scaleResolutionDownBy: 2,
},
{
rid: 'low',
maxBitrate: 150000,
maxFramerate: 15,
scaleResolutionDownBy: 4,
},
];
}
/**
* Subscribe to a remote media stream
*/
async subscribe(streamId: string, options: { videoQuality?: 'high' | 'medium' | 'low' } = {}): Promise<MediaStream> {
return new Promise((resolve, reject) => {
const pc = new RTCPeerConnection({
iceServers: this.getIceServers(),
});
pc.ontrack = (event) => {
const stream = event.streams[0];
this.emit('remoteStreamAdded', { streamId, stream });
resolve(stream);
};
pc.onicecandidate = (event) => {
if (event.candidate) {
this.signaling.send('ice-candidate', {
streamId,
candidate: event.candidate,
});
}
};
// Ask the server to start forwarding this stream
this.signaling.send('subscribe', {
streamId,
videoQuality: options.videoQuality || 'high',
});
// Wait for the server's offer for this subscription
this.signaling.once(`offer:${streamId}`, async (data: { sdp: string }) => {
try {
await pc.setRemoteDescription({
type: 'offer',
sdp: data.sdp,
});
const answer = await pc.createAnswer();
await pc.setLocalDescription(answer);
this.signaling.send('answer', {
streamId,
sdp: answer.sdp,
});
} catch (error) {
reject(error);
}
});
});
}
private handleMediaError(error: Error): void {
const errorMap: Record<string, string> = {
'NotFoundError': 'No camera or microphone was found',
'NotAllowedError': 'Please allow the browser to access the camera and microphone',
'NotReadableError': 'The device may be in use by another application',
'OverconstrainedError': 'The device does not support the requested resolution',
};
const message = errorMap[error.name] || 'Failed to access media devices';
this.emit('error', new Error(message));
}
}
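A minimal usage sketch of the client above, on the teacher's side. The signaling URL is a placeholder and error handling is trimmed for brevity.
// usage sketch (teacher side); URL is a placeholder
async function startTeaching(): Promise<void> {
  const client = new WebRTCClient({ signalingUrl: 'wss://signal.example.com' });

  client.on('connectionStateChange', (state: RTCPeerConnectionState) => {
    console.log('peer connection state:', state);
  });
  client.on('error', (err: Error) => console.error(err.message));

  // Capture camera + microphone, then publish with simulcast so the SFU
  // can forward a lower layer to students on weak networks.
  const stream = await client.captureLocalMedia({ video: true, audio: true });
  await client.publish(stream, { simulcast: true });
}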
2. Real-Time Messaging Module
In-class interactions (chat, hand raising, calling on students, and so on) rely on a real-time messaging system:
// realtime-messaging.ts
interface Message {
id: string;
type: 'chat' | 'raise_hand' | 'answer' | 'system';
senderId: string;
senderName: string;
content: any;
timestamp: number;
}
interface RoomState {
participants: Map<string, Participant>;
raisedHands: Set<string>;
speakingUser: string | null;
classStatus: 'waiting' | 'ongoing' | 'ended';
}
export class ClassroomMessaging {
private ws!: WebSocket; // assigned in connect(), called from the constructor
private roomId: string;
private userId: string;
private state: RoomState;
private eventHandlers: Map<string, Function[]> = new Map();
// Message queue and reconnection state
private messageQueue: Message[] = [];
private reconnectAttempts = 0;
private readonly maxReconnectAttempts = 5;
constructor(config: {
wsUrl: string;
roomId: string;
userId: string;
token: string;
}) {
this.roomId = config.roomId;
this.userId = config.userId;
this.state = {
participants: new Map(),
raisedHands: new Set(),
speakingUser: null,
classStatus: 'waiting',
};
this.connect(config.wsUrl, config.token);
}
private connect(url: string, token: string): void {
this.ws = new WebSocket(`${url}?token=${token}&room=${this.roomId}`);
this.ws.onopen = () => {
console.log('WebSocket connected');
this.reconnectAttempts = 0;
// Flush messages queued while offline
while (this.messageQueue.length > 0) {
const msg = this.messageQueue.shift()!;
this.ws.send(JSON.stringify(msg));
}
this.emit('connected');
};
this.ws.onmessage = (event) => {
try {
const data = JSON.parse(event.data);
this.handleMessage(data);
} catch (error) {
console.error('Failed to parse message:', error);
}
};
this.ws.onclose = (event) => {
if (!event.wasClean) {
this.attemptReconnect(url, token);
}
};
this.ws.onerror = (error) => {
console.error('WebSocket error:', error);
this.emit('error', error);
};
}
private attemptReconnect(url: string, token: string): void {
if (this.reconnectAttempts >= this.maxReconnectAttempts) {
this.emit('connectionFailed');
return;
}
this.reconnectAttempts++;
const delay = Math.min(1000 * Math.pow(2, this.reconnectAttempts), 30000);
console.log(`Reconnecting in ${delay}ms (attempt ${this.reconnectAttempts})`);
setTimeout(() => {
this.connect(url, token);
}, delay);
}
private handleMessage(data: any): void {
switch (data.type) {
case 'room_state':
this.syncRoomState(data.state);
break;
case 'user_join':
this.state.participants.set(data.user.id, data.user);
this.emit('userJoin', data.user);
break;
case 'user_leave':
this.state.participants.delete(data.userId);
this.emit('userLeave', data.userId);
break;
case 'chat_message':
this.emit('chatMessage', data.message);
break;
case 'raise_hand':
this.state.raisedHands.add(data.userId);
this.emit('handRaised', data.userId);
break;
case 'lower_hand':
this.state.raisedHands.delete(data.userId);
this.emit('handLowered', data.userId);
break;
case 'speaking_change':
this.state.speakingUser = data.userId;
this.emit('speakingChange', data.userId);
break;
case 'class_status':
this.state.classStatus = data.status;
this.emit('classStatusChange', data.status);
break;
case 'quiz_start':
this.emit('quizStart', data.quiz);
break;
case 'quiz_result':
this.emit('quizResult', data.result);
break;
}
}
/**
* Send a chat message
*/
sendChatMessage(content: string): void {
this.send({
type: 'chat_message',
content,
});
}
/**
* Raise hand
*/
raiseHand(): void {
this.send({ type: 'raise_hand' });
}
/**
* Lower hand
*/
lowerHand(): void {
this.send({ type: 'lower_hand' });
}
/**
* Invite a student to speak (teacher only)
*/
inviteToSpeak(userId: string): void {
this.send({
type: 'invite_speak',
targetUserId: userId,
});
}
/**
* Start an in-class quiz (teacher only)
*/
startQuiz(quiz: {
question: string;
options: string[];
correctIndex: number;
duration: number;
}): void {
this.send({
type: 'start_quiz',
quiz,
});
}
/**
* Submit a quiz answer
*/
submitQuizAnswer(quizId: string, answerIndex: number): void {
this.send({
type: 'quiz_answer',
quizId,
answerIndex,
});
}
private send(data: any): void {
const message = {
...data,
senderId: this.userId,
timestamp: Date.now(),
};
if (this.ws.readyState === WebSocket.OPEN) {
this.ws.send(JSON.stringify(message));
} else {
// Queue the message while offline; it is flushed after reconnecting
this.messageQueue.push(message);
}
}
on(event: string, handler: Function): void {
if (!this.eventHandlers.has(event)) {
this.eventHandlers.set(event, []);
}
this.eventHandlers.get(event)!.push(handler);
}
private emit(event: string, ...args: any[]): void {
const handlers = this.eventHandlers.get(event) || [];
handlers.forEach(handler => handler(...args));
}
}
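A usage sketch of the messaging client on the student side. The URL, token, and the shape of the quiz payload (including its id field) are assumptions for illustration.
// usage sketch (student side); URL/token/payload shapes are placeholders
const messaging = new ClassroomMessaging({
  wsUrl: 'wss://rtm.example.com/classroom',
  roomId: 'room-42',
  userId: 'student-1001',
  token: '<jwt-from-login>',
});

messaging.on('chatMessage', (msg: { senderName: string; content: string }) => {
  console.log(`${msg.senderName}: ${msg.content}`);
});

messaging.on('quizStart', (quiz: { id: string; question: string; options: string[] }) => {
  // Render the quiz UI; here we simply submit the first option.
  messaging.submitQuizAnswer(quiz.id, 0);
});

// Ask to speak; the teacher sees this via the 'handRaised' event.
messaging.raiseHand();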
3. Interactive Whiteboard Module
The whiteboard is a core teaching tool and must support real-time collaboration among multiple participants:
// whiteboard.ts
interface WhiteboardOperation {
type: 'draw' | 'erase' | 'shape' | 'text' | 'image' | 'clear' | 'undo';
userId: string;
timestamp: number;
data: any;
}
interface DrawData {
tool: 'pen' | 'highlighter' | 'eraser';
points: { x: number; y: number }[];
color: string;
width: number;
}
interface ShapeData {
shapeType: 'rectangle' | 'circle' | 'arrow' | 'line';
start: { x: number; y: number };
end: { x: number; y: number };
color: string;
fill?: string;
}
export class InteractiveWhiteboard {
private canvas: HTMLCanvasElement;
private ctx: CanvasRenderingContext2D;
private operations: WhiteboardOperation[] = [];
private socket: WebSocket;
// Drawing state
private isDrawing = false;
private currentPath: { x: number; y: number }[] = [];
private currentTool: 'pen' | 'highlighter' | 'eraser' = 'pen';
private currentColor = '#000000';
private currentWidth = 3;
// Operation history (for undo)
private history: WhiteboardOperation[] = [];
private historyIndex = -1;
constructor(canvas: HTMLCanvasElement, socket: WebSocket) {
this.canvas = canvas;
this.ctx = canvas.getContext('2d')!;
this.socket = socket;
this.setupCanvas();
this.bindEvents();
this.setupSocketHandlers();
}
private setupCanvas(): void {
// Size the canvas, taking devicePixelRatio into account
const dpr = window.devicePixelRatio || 1;
const rect = this.canvas.getBoundingClientRect();
this.canvas.width = rect.width * dpr;
this.canvas.height = rect.height * dpr;
this.ctx.scale(dpr, dpr);
// Default stroke style
this.ctx.lineCap = 'round';
this.ctx.lineJoin = 'round';
}
private bindEvents(): void {
// Mouse events
this.canvas.addEventListener('mousedown', this.handleStart.bind(this));
this.canvas.addEventListener('mousemove', this.handleMove.bind(this));
this.canvas.addEventListener('mouseup', this.handleEnd.bind(this));
this.canvas.addEventListener('mouseleave', this.handleEnd.bind(this));
// Touch events (mobile support)
this.canvas.addEventListener('touchstart', this.handleTouchStart.bind(this));
this.canvas.addEventListener('touchmove', this.handleTouchMove.bind(this));
this.canvas.addEventListener('touchend', this.handleEnd.bind(this));
}
private setupSocketHandlers(): void {
this.socket.addEventListener('message', (event) => {
const data = JSON.parse(event.data);
if (data.type === 'whiteboard_operation') {
this.applyRemoteOperation(data.operation);
} else if (data.type === 'whiteboard_sync') {
this.syncFromOperations(data.operations);
}
});
}
private getPoint(event: MouseEvent | Touch): { x: number; y: number } {
const rect = this.canvas.getBoundingClientRect();
return {
x: event.clientX - rect.left,
y: event.clientY - rect.top,
};
}
private handleStart(event: MouseEvent): void {
this.isDrawing = true;
const point = this.getPoint(event);
this.currentPath = [point];
this.ctx.beginPath();
this.ctx.moveTo(point.x, point.y);
}
private handleMove(event: MouseEvent): void {
if (!this.isDrawing) return;
const point = this.getPoint(event);
this.currentPath.push(point);
// Draw locally first for zero-latency feedback
this.drawSegment(point);
// Broadcast in near real time (throttled)
this.throttledBroadcast();
}
private handleEnd(): void {
if (!this.isDrawing) return;
this.isDrawing = false;
if (this.currentPath.length > 0) {
const operation: WhiteboardOperation = {
type: 'draw',
userId: this.getCurrentUserId(),
timestamp: Date.now(),
data: {
tool: this.currentTool,
points: this.currentPath,
color: this.currentColor,
width: this.currentWidth,
} as DrawData,
};
this.addToHistory(operation);
this.broadcastOperation(operation);
}
this.currentPath = [];
}
private handleTouchStart(event: TouchEvent): void {
event.preventDefault();
const touch = event.touches[0];
this.handleStart({ clientX: touch.clientX, clientY: touch.clientY } as MouseEvent);
}
private handleTouchMove(event: TouchEvent): void {
event.preventDefault();
const touch = event.touches[0];
this.handleMove({ clientX: touch.clientX, clientY: touch.clientY } as MouseEvent);
}
private drawSegment(point: { x: number; y: number }): void {
this.ctx.strokeStyle = this.currentTool === 'eraser'
? '#ffffff'
: this.currentColor;
this.ctx.lineWidth = this.currentTool === 'eraser'
? this.currentWidth * 3
: this.currentWidth;
this.ctx.globalAlpha = this.currentTool === 'highlighter' ? 0.3 : 1;
this.ctx.lineTo(point.x, point.y);
this.ctx.stroke();
this.ctx.beginPath();
this.ctx.moveTo(point.x, point.y);
}
/**
* Apply an operation received from a remote participant
*/
private applyRemoteOperation(operation: WhiteboardOperation): void {
switch (operation.type) {
case 'draw':
this.replayDrawOperation(operation.data as DrawData);
break;
case 'shape':
this.drawShape(operation.data as ShapeData);
break;
case 'clear':
this.clearCanvas();
break;
case 'undo':
this.handleRemoteUndo(operation.userId);
break;
}
this.operations.push(operation);
}
private replayDrawOperation(data: DrawData): void {
if (data.points.length < 2) return;
this.ctx.beginPath();
this.ctx.strokeStyle = data.tool === 'eraser' ? '#ffffff' : data.color;
this.ctx.lineWidth = data.tool === 'eraser' ? data.width * 3 : data.width;
this.ctx.globalAlpha = data.tool === 'highlighter' ? 0.3 : 1;
this.ctx.moveTo(data.points[0].x, data.points[0].y);
for (let i = 1; i < data.points.length; i++) {
this.ctx.lineTo(data.points[i].x, data.points[i].y);
}
this.ctx.stroke();
this.ctx.globalAlpha = 1;
}
private drawShape(data: ShapeData): void {
this.ctx.beginPath();
this.ctx.strokeStyle = data.color;
this.ctx.lineWidth = 2;
switch (data.shapeType) {
case 'rectangle':
const width = data.end.x - data.start.x;
const height = data.end.y - data.start.y;
this.ctx.rect(data.start.x, data.start.y, width, height);
break;
case 'circle':
const radius = Math.sqrt(
Math.pow(data.end.x - data.start.x, 2) +
Math.pow(data.end.y - data.start.y, 2)
);
this.ctx.arc(data.start.x, data.start.y, radius, 0, 2 * Math.PI);
break;
case 'arrow':
this.drawArrow(data.start, data.end);
break;
case 'line':
this.ctx.moveTo(data.start.x, data.start.y);
this.ctx.lineTo(data.end.x, data.end.y);
break;
}
if (data.fill) {
this.ctx.fillStyle = data.fill;
this.ctx.fill();
}
this.ctx.stroke();
}
private drawArrow(start: { x: number; y: number }, end: { x: number; y: number }): void {
const headLength = 15;
const angle = Math.atan2(end.y - start.y, end.x - start.x);
// Shaft
this.ctx.moveTo(start.x, start.y);
this.ctx.lineTo(end.x, end.y);
// Arrow head
this.ctx.lineTo(
end.x - headLength * Math.cos(angle - Math.PI / 6),
end.y - headLength * Math.sin(angle - Math.PI / 6)
);
this.ctx.moveTo(end.x, end.y);
this.ctx.lineTo(
end.x - headLength * Math.cos(angle + Math.PI / 6),
end.y - headLength * Math.sin(angle + Math.PI / 6)
);
}
/**
* Undo the last local operation
*/
undo(): void {
if (this.historyIndex < 0) return;
const operation: WhiteboardOperation = {
type: 'undo',
userId: this.getCurrentUserId(),
timestamp: Date.now(),
data: { operationIndex: this.historyIndex },
};
this.historyIndex--;
this.redrawFromHistory();
this.broadcastOperation(operation);
}
/**
* Redraw the canvas from history
*/
private redrawFromHistory(): void {
this.clearCanvas();
for (let i = 0; i <= this.historyIndex; i++) {
const op = this.history[i];
if (op.type === 'draw') {
this.replayDrawOperation(op.data as DrawData);
} else if (op.type === 'shape') {
this.drawShape(op.data as ShapeData);
}
}
}
/**
* Clear the whiteboard
*/
clear(): void {
const operation: WhiteboardOperation = {
type: 'clear',
userId: this.getCurrentUserId(),
timestamp: Date.now(),
data: null,
};
this.clearCanvas();
this.history = [];
this.historyIndex = -1;
this.broadcastOperation(operation);
}
private clearCanvas(): void {
this.ctx.fillStyle = '#ffffff';
this.ctx.fillRect(0, 0, this.canvas.width, this.canvas.height);
}
/**
* Set the drawing tool
*/
setTool(tool: 'pen' | 'highlighter' | 'eraser'): void {
this.currentTool = tool;
}
setColor(color: string): void {
this.currentColor = color;
}
setWidth(width: number): void {
this.currentWidth = width;
}
// Throttled broadcast of in-progress strokes
private broadcastTimeout: number | null = null;
private throttledBroadcast(): void {
if (this.broadcastTimeout) return;
this.broadcastTimeout = window.setTimeout(() => {
this.broadcastOperation({
type: 'draw',
userId: this.getCurrentUserId(),
timestamp: Date.now(),
data: {
tool: this.currentTool,
points: this.currentPath.slice(-10), // only send the most recent points
color: this.currentColor,
width: this.currentWidth,
},
});
this.broadcastTimeout = null;
}, 16); // ~60fps
}
private broadcastOperation(operation: WhiteboardOperation): void {
this.socket.send(JSON.stringify({
type: 'whiteboard_operation',
operation,
}));
}
}
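Wiring the whiteboard up is straightforward; the sketch below attaches it to a canvas element and a dedicated WebSocket. The element id and URL are placeholders.
// usage sketch; element id and URL are placeholders
const boardCanvas = document.getElementById('board') as HTMLCanvasElement;
const boardSocket = new WebSocket('wss://rtm.example.com/whiteboard?room=room-42');

boardSocket.addEventListener('open', () => {
  const board = new InteractiveWhiteboard(boardCanvas, boardSocket);
  board.setTool('pen');
  board.setColor('#d32f2f');
  board.setWidth(4);
  // A teacher toolbar would typically also bind clear() and undo() to buttons.
});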
Latency Optimization Strategies
Network Transport Optimization
// network-optimizer.ts
export class NetworkOptimizer {
private rttHistory: number[] = [];
private jitterHistory: number[] = [];
private packetLossHistory: number[] = [];
/**
* Dynamically adapt video quality to current network conditions
*/
async adaptVideoQuality(peerConnection: RTCPeerConnection): Promise<void> {
const stats = await peerConnection.getStats();
const networkQuality = this.analyzeNetworkQuality(stats);
const senders = peerConnection.getSenders();
for (const sender of senders) {
if (sender.track?.kind === 'video') {
await this.adjustEncodings(sender, networkQuality);
}
}
}
private analyzeNetworkQuality(stats: RTCStatsReport): NetworkQuality {
let rtt = 0;
let packetLoss = 0;
let jitter = 0;
stats.forEach((report) => {
if (report.type === 'candidate-pair' && report.state === 'succeeded') {
rtt = report.currentRoundTripTime * 1000 || 0;
}
// Note: on a send-only connection, loss/jitter are reported under 'remote-inbound-rtp' instead.
if (report.type === 'inbound-rtp' && report.kind === 'video') {
const total = (report.packetsReceived ?? 0) + (report.packetsLost ?? 0);
packetLoss = total > 0 ? (report.packetsLost / total) * 100 : 0;
jitter = (report.jitter ?? 0) * 1000;
}
});
// Record history
this.rttHistory.push(rtt);
this.jitterHistory.push(jitter);
this.packetLossHistory.push(packetLoss);
// Keep only the most recent 10 samples
if (this.rttHistory.length > 10) {
this.rttHistory.shift();
this.jitterHistory.shift();
this.packetLossHistory.shift();
}
// Moving averages
const avgRtt = this.average(this.rttHistory);
const avgJitter = this.average(this.jitterHistory);
const avgPacketLoss = this.average(this.packetLossHistory);
// Overall quality assessment
return {
rtt: avgRtt,
jitter: avgJitter,
packetLoss: avgPacketLoss,
level: this.getQualityLevel(avgRtt, avgPacketLoss),
};
}
private getQualityLevel(rtt: number, packetLoss: number): 'excellent' | 'good' | 'fair' | 'poor' {
if (rtt < 100 && packetLoss < 1) return 'excellent';
if (rtt < 200 && packetLoss < 3) return 'good';
if (rtt < 500 && packetLoss < 5) return 'fair';
return 'poor';
}
private async adjustEncodings(
sender: RTCRtpSender,
quality: NetworkQuality
): Promise<void> {
const params = sender.getParameters();
if (!params.encodings || params.encodings.length === 0) {
return; // nothing negotiated yet
}
switch (quality.level) {
case 'excellent':
// High-quality settings
params.encodings[0].maxBitrate = 2000000;
params.encodings[0].maxFramerate = 30;
break;
case 'good':
params.encodings[0].maxBitrate = 1000000;
params.encodings[0].maxFramerate = 24;
break;
case 'fair':
params.encodings[0].maxBitrate = 500000;
params.encodings[0].maxFramerate = 15;
break;
case 'poor':
params.encodings[0].maxBitrate = 200000;
params.encodings[0].maxFramerate = 10;
break;
}
await sender.setParameters(params);
}
private average(arr: number[]): number {
return arr.reduce((a, b) => a + b, 0) / arr.length;
}
}
interface NetworkQuality {
rtt: number;
jitter: number;
packetLoss: number;
level: 'excellent' | 'good' | 'fair' | 'poor';
}
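The optimizer is meant to run periodically while a publish connection is active. A small usage sketch follows; the 3-second interval is an assumption, not a requirement.
// usage sketch: run the adaptation loop while publishing
function startQualityAdaptation(pc: RTCPeerConnection): () => void {
  const optimizer = new NetworkOptimizer();
  const timer = setInterval(() => {
    optimizer.adaptVideoQuality(pc).catch((err) => console.warn('adaptation failed:', err));
  }, 3000);
  // Return a stop function to call on unpublish / leave.
  return () => clearInterval(timer);
}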
Audio/Video Synchronization
// av-sync.ts
export class AVSynchronizer {
private audioContext: AudioContext;
private videoElement: HTMLVideoElement;
private syncThreshold = 40; // 40 ms drift threshold
constructor(video: HTMLVideoElement, audio: HTMLAudioElement) {
this.videoElement = video;
this.audioContext = new AudioContext();
}
/**
* Detect and correct audio/video drift
*/
checkAndFixSync(): void {
const videoTime = this.videoElement.currentTime;
const audioTime = this.getAudioTime();
const drift = Math.abs(videoTime - audioTime) * 1000;
if (drift > this.syncThreshold) {
console.log(`A/V drift detected: ${drift.toFixed(2)}ms`);
this.resync();
}
}
private resync(): void {
// Use the audio clock as the reference and adjust the video playback rate
const audioTime = this.getAudioTime();
const videoTime = this.videoElement.currentTime;
if (videoTime < audioTime) {
// Video is behind the audio: speed up slightly
this.videoElement.playbackRate = 1.1;
} else {
// Video is ahead of the audio: slow down slightly
this.videoElement.playbackRate = 0.9;
}
// Restore normal speed after a brief correction window
setTimeout(() => {
this.videoElement.playbackRate = 1.0;
}, 500);
}
private getAudioTime(): number {
return this.audioContext.currentTime;
}
}
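A usage sketch for the synchronizer; the element ids and the 250 ms check interval are illustrative assumptions.
// usage sketch; element ids are placeholders
const remoteVideo = document.getElementById('remote-video') as HTMLVideoElement;
const remoteAudio = document.getElementById('remote-audio') as HTMLAudioElement;

const avSync = new AVSynchronizer(remoteVideo, remoteAudio);
const avSyncTimer = setInterval(() => avSync.checkAndFixSync(), 250);
// clearInterval(avSyncTimer) when playback is torn down.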
Classroom Recording and Playback
Recording Architecture
┌─────────────────────────────────────────┐
│            Recording Server             │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐  │
│ │  Video   │ │  Audio   │ │Whiteboard│  │
│ │ recorder │ │ recorder │ │ recorder │  │
│ └────┬─────┘ └────┬─────┘ └────┬─────┘  │
│      │            │            │        │
│      ▼            ▼            ▼        │
│ ┌─────────────────────────────────────┐ │
│ │          Compositing engine         │ │
│ │  - Audio/video mixing               │ │
│ │  - Whiteboard overlay               │ │
│ │  - Timeline alignment               │ │
│ └─────────────────────────────────────┘ │
│                    │                    │
│                    ▼                    │
│ ┌─────────────────────────────────────┐ │
│ │        Transcoding & storage        │ │
│ │  - Multi-bitrate output             │ │
│ │  - Segmented storage                │ │
│ │  - CDN distribution                 │ │
│ └─────────────────────────────────────┘ │
└─────────────────────────────────────────┘
Recording Implementation
// recording-service.ts
interface RecordingConfig {
roomId: string;
outputFormat: 'mp4' | 'webm';
resolution: { width: number; height: number };
frameRate: number;
includeWhiteboard: boolean;
}
export class RecordingService {
private mediaRecorder: MediaRecorder | null = null;
private recordedChunks: Blob[] = [];
private isRecording = false;
/**
* Start recording
*/
async startRecording(
streams: { video: MediaStream; audio: MediaStream; whiteboard?: MediaStream },
config: RecordingConfig
): Promise<void> {
// Mark recording as active before composing: the draw loop inside
// composeStreams() stops as soon as isRecording is false.
this.isRecording = true;
// Compose the individual streams onto one canvas
const composedStream = await this.composeStreams(streams, config);
// Pick a supported encoder
const mimeType = this.getSupportedMimeType();
this.mediaRecorder = new MediaRecorder(composedStream, {
mimeType,
videoBitsPerSecond: 2500000,
audioBitsPerSecond: 128000,
});
this.mediaRecorder.ondataavailable = (event) => {
if (event.data.size > 0) {
this.recordedChunks.push(event.data);
// Upload each chunk to the server as it is produced
this.uploadChunk(event.data);
}
};
this.mediaRecorder.start(1000); // fire ondataavailable every second
}
/**
* Stop recording and return the full recording as a Blob
*/
stopRecording(): Promise<Blob> {
return new Promise((resolve) => {
if (!this.mediaRecorder || !this.isRecording) {
resolve(new Blob());
return;
}
this.mediaRecorder.onstop = () => {
const blob = new Blob(this.recordedChunks, {
type: this.mediaRecorder!.mimeType,
});
this.recordedChunks = [];
this.isRecording = false;
resolve(blob);
};
this.mediaRecorder.stop();
});
}
/**
* Compose the video, audio, and whiteboard streams onto a single canvas
*/
private async composeStreams(
streams: { video: MediaStream; audio: MediaStream; whiteboard?: MediaStream },
config: RecordingConfig
): Promise<MediaStream> {
const canvas = document.createElement('canvas');
canvas.width = config.resolution.width;
canvas.height = config.resolution.height;
const ctx = canvas.getContext('2d')!;
// Video element for the camera stream
const videoElement = document.createElement('video');
videoElement.srcObject = streams.video;
videoElement.muted = true;
await videoElement.play();
// Whiteboard element (if present)
let whiteboardVideo: HTMLVideoElement | null = null;
if (streams.whiteboard) {
whiteboardVideo = document.createElement('video');
whiteboardVideo.srcObject = streams.whiteboard;
whiteboardVideo.muted = true;
await whiteboardVideo.play();
}
// Drawing loop
const drawFrame = () => {
if (!this.isRecording) return;
// Clear the canvas
ctx.fillStyle = '#ffffff';
ctx.fillRect(0, 0, canvas.width, canvas.height);
if (whiteboardVideo) {
// Whiteboard fills the main area
ctx.drawImage(whiteboardVideo, 0, 0, canvas.width, canvas.height);
// Camera as picture-in-picture (top right)
const pipWidth = canvas.width * 0.2;
const pipHeight = (pipWidth / 16) * 9;
ctx.drawImage(
videoElement,
canvas.width - pipWidth - 10,
10,
pipWidth,
pipHeight
);
} else {
// Camera only: draw it full screen
ctx.drawImage(videoElement, 0, 0, canvas.width, canvas.height);
}
requestAnimationFrame(drawFrame);
};
drawFrame();
// Capture the composed canvas as a stream
const canvasStream = canvas.captureStream(config.frameRate);
// Add the audio tracks
const audioTracks = streams.audio.getAudioTracks();
audioTracks.forEach(track => canvasStream.addTrack(track));
return canvasStream;
}
private getSupportedMimeType(): string {
const types = [
'video/webm;codecs=vp9,opus',
'video/webm;codecs=vp8,opus',
'video/webm',
'video/mp4',
];
for (const type of types) {
if (MediaRecorder.isTypeSupported(type)) {
return type;
}
}
throw new Error('No supported video format found');
}
private async uploadChunk(chunk: Blob): Promise<void> {
const formData = new FormData();
formData.append('chunk', chunk);
formData.append('timestamp', Date.now().toString());
try {
await fetch('/api/recording/chunk', {
method: 'POST',
body: formData,
});
} catch (error) {
console.error('Failed to upload chunk:', error);
}
}
}
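A usage sketch for client-side recording of the teacher's camera and microphone plus a whiteboard canvas capture. The stream sources, frame rates, and room id are placeholder assumptions.
// usage sketch; inputs and ids are placeholders
async function recordLesson(
  cameraStream: MediaStream,            // camera + microphone
  whiteboardCanvas: HTMLCanvasElement
): Promise<RecordingService> {
  const recorder = new RecordingService();
  await recorder.startRecording(
    {
      video: cameraStream,
      audio: cameraStream,              // audio tracks are read from this stream
      whiteboard: whiteboardCanvas.captureStream(15),
    },
    {
      roomId: 'room-42',
      outputFormat: 'webm',
      resolution: { width: 1280, height: 720 },
      frameRate: 25,
      includeWhiteboard: true,
    }
  );
  return recorder; // the caller invokes recorder.stopRecording() when the class ends
}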
Summary
Building a live-streaming system for online education means balancing several concerns:
- Low-latency architecture: WebRTC + SFU for sub-second latency
- Stability: reconnection, network adaptation, and multiple layers of fault tolerance
- Rich interaction: whiteboard, chat, hand raising, quizzes, and more
- Recording and playback: record classes for on-demand review
- Elastic scale: from 1-on-1 tutoring to open lectures with tens of thousands of attendees
The technical bar is high, so choose an approach that matches your stage: early-stage teams can build on third-party SDKs (such as Agora or Tencent Cloud), while mature teams can customize open-source stacks (such as Jitsi or Janus).
Further Reading
- WebRTC official documentation - the WebRTC specification
- MediaSoup - an open-source SFU server
- Agora - a commercial real-time audio/video service
- Excalidraw - an open-source online whiteboard


