Ba Yanxing
Shenzhen Kaihong Digital Industry Development Co., Ltd.
Senior OS Framework Development Engineer
The following content comes from a guest contribution and does not represent the views of the OpenAtom Foundation.
1. Introduction
The media subsystem provides developers with a set of interfaces for using the system's media resources. It mainly contains modules for audio/video development, camera development, and streaming media development, each of which exposes corresponding interfaces to upper-layer applications. This article gives a detailed introduction to the audio/video playback framework within the audio/video development module.
2. Directory structure
foundation/multimedia/media_standard
├── frameworks                  # framework code
│   ├── js
│   │   └── player
│   ├── native
│   │   └── player              # native implementation
│   └── videodisplaymanager     # display management
│       ├── include
│       └── src
├── interfaces
│   ├── inner_api               # internal interfaces
│   │   └── native
│   └── kits                    # external JS interfaces
├── sa_profile                  # service configuration files
└── services
    ├── engine                  # engine code
    │   └── gstreamer
    ├── etc                     # service configuration files
    ├── include                 # header files
    └── services
        ├── sa_media            # media service
        │   ├── client          # media client
        │   ├── ipc             # media IPC calls
        │   └── server          # media server
        ├── factory             # engine factory
        └── player              # player service
            ├── client          # player client
            ├── ipc             # player IPC calls
            └── server          # player server
3. Overall playback flow
4. Using the native interfaces

In OpenHarmony, audio/video playback is exposed to the upper JS layer through N-API interfaces; N-API serves as the bridge between JS and native code. The OpenHarmony source tree provides an audio/video playback example that can be called directly from C++, located in the foundation/multimedia/player_framework/test/nativedemo/player directory.

void PlayerDemo::RunCase(const string &path)
{
    player_ = OHOS::Media::PlayerFactory::CreatePlayer();
    if (player_ == nullptr) {
        cout << "player_ is null" << endl;
        return;
    }
    RegisterTable();
    std::shared_ptr<PlayerCallbackDemo> cb = std::make_shared<PlayerCallbackDemo>();
    int32_t ret = player_->SetPlayerCallback(cb);
    if (ret != 0) {
        cout << "SetPlayerCallback fail" << endl;
    }
    if (SelectSource(path) != 0) {
        cout << "SetSource fail" << endl;
        return;
    }
    sptr<Surface> producerSurface = GetVideoSurface();
    if (producerSurface != nullptr) {
        ret = player_->SetVideoSurface(producerSurface);
        if (ret != 0) {
            cout << "SetVideoSurface fail" << endl;
        }
    }
    SetVideoScaleType();
    if (SelectRendererMode() != 0) {
        cout << "set renderer info fail" << endl;
    }
    ret = player_->PrepareAsync();
    if (ret != 0) {
        cout << "PrepareAsync fail" << endl;
        return;
    }
    cout << "Enter your step:" << endl;
    DoNext();
}

void PlayerDemo::DoNext()
{
    std::string cmd;
    while (std::getline(std::cin, cmd)) {
        auto iter = playerTable_.find(cmd);
        if (iter != playerTable_.end()) {
            auto func = iter->second;
            if (func() != 0) {
                cout << "Operation error" << endl;
            }
            continue;
        } else if (cmd.find("quit") != std::string::npos || cmd == "q") {
            break;
        } else {
            DoCmd(cmd);
            continue;
        }
    }
}
void PlayerDemo::RegisterTable()
{
    (void)playerTable_.emplace("prepare", std::bind(&Player::Prepare, player_));
    (void)playerTable_.emplace("prepareasync", std::bind(&Player::PrepareAsync, player_));
    (void)playerTable_.emplace("", std::bind(&Player::Play, player_)); // enter -> play
    (void)playerTable_.emplace("play", std::bind(&Player::Play, player_));
    (void)playerTable_.emplace("pause", std::bind(&Player::Pause, player_));
    (void)playerTable_.emplace("stop", std::bind(&Player::Stop, player_));
    (void)playerTable_.emplace("reset", std::bind(&Player::Reset, player_));
    (void)playerTable_.emplace("release", std::bind(&Player::Release, player_));
    (void)playerTable_.emplace("isplaying", std::bind(&PlayerDemo::GetPlaying, this));
    (void)playerTable_.emplace("isloop", std::bind(&PlayerDemo::GetLooping, this));
    (void)playerTable_.emplace("speed", std::bind(&PlayerDemo::GetPlaybackSpeed, this));
}

The core of the DoNext method above is the call to func(), which is the function previously registered in the map for the entered command string. RegisterTable binds both the empty string and "play" to Player::Play, so when no command argument is entered (a bare Enter), the default operation is playback.
5. Call flow
This section analyzes the framework-layer code of media playback, so the flow involves the client and server sides of the IPC calls; the analysis stops at the point where the GStreamer engine is invoked. First, the sample creates a player instance (a PlayerImpl object) through PlayerFactory, and the Init function is called during creation.

int32_t PlayerImpl::Init()
{
    playerService_ = MediaServiceFactory::GetInstance().CreatePlayerService();
    CHECK_AND_RETURN_RET_LOG(playerService_ != nullptr, MSERR_UNKNOWN, "failed to create player service");
    return MSERR_OK;
}

MediaServiceFactory::GetInstance() returns the MediaClient object, so CreatePlayerService actually invokes the corresponding method on MediaClient.

std::shared_ptr<IPlayerService> MediaClient::CreatePlayerService()
{
    std::lock_guard<std::mutex> lock(mutex_);
    if (!IsAlived()) {
        MEDIA_LOGE("media service does not exist.");
        return nullptr;
    }
    sptr<IRemoteObject> object = mediaProxy_->GetSubSystemAbility(
        IStandardMediaService::MEDIA_PLAYER, listenerStub_->AsObject());
    CHECK_AND_RETURN_RET_LOG(object != nullptr, nullptr, "player proxy object is nullptr.");
    sptr<IStandardPlayerService> playerProxy = iface_cast<IStandardPlayerService>(object);
    CHECK_AND_RETURN_RET_LOG(playerProxy != nullptr, nullptr, "player proxy is nullptr.");
    std::shared_ptr<PlayerClient> player = PlayerClient::Create(playerProxy);
    CHECK_AND_RETURN_RET_LOG(player != nullptr, nullptr, "failed to create player client.");
    playerClientList_.push_back(player);
    return player;
}

This method creates a PlayerClient instance through PlayerClient::Create(playerProxy) and passes it back up layer by layer until it lands in PlayerImpl's playerService_ member; all subsequent player operations on PlayerImpl go through this PlayerClient instance.

int32_t PlayerImpl::Play()
{
    CHECK_AND_RETURN_RET_LOG(playerService_ != nullptr, MSERR_INVALID_OPERATION, "player service does not exist.");
    MEDIA_LOGW("KPI-TRACE: PlayerImpl Play in");
    return playerService_->Play();
}

int32_t PlayerImpl::Prepare()
{
    CHECK_AND_RETURN_RET_LOG(playerService_ != nullptr, MSERR_INVALID_OPERATION, "player service does not exist.");
    MEDIA_LOGW("KPI-TRACE: PlayerImpl Prepare in");
    return playerService_->Prepare();
}

int32_t PlayerImpl::PrepareAsync()
{
    CHECK_AND_RETURN_RET_LOG(playerService_ != nullptr, MSERR_INVALID_OPERATION, "player service does not exist.");
    MEDIA_LOGW("KPI-TRACE: PlayerImpl PrepareAsync in");
    return playerService_->PrepareAsync();
}

For PlayerImpl, the PlayerClient pointed to by playerService_ is the concrete implementation, and PlayerClient in turn is implemented through IPC remote calls: the proxy side of the IPC pair sends the call request to the remote service. Take Play as an example:

int32_t PlayerClient::Play()
{
    std::lock_guard<std::mutex> lock(mutex_);
    CHECK_AND_RETURN_RET_LOG(playerProxy_ != nullptr, MSERR_NO_MEMORY, "player service does not exist.");
    return playerProxy_->Play();
}

int32_t PlayerServiceProxy::Play()
{
    MessageParcel data;
    MessageParcel reply;
    MessageOption option;
    if (!data.WriteInterfaceToken(PlayerServiceProxy::GetDescriptor())) {
        MEDIA_LOGE("Failed to write descriptor");
        return MSERR_UNKNOWN;
    }
    int error = Remote()->SendRequest(PLAY, data, reply, option);
    if (error != MSERR_OK) {
        MEDIA_LOGE("Play failed, error: %{public}d", error);
        return error;
    }
    return reply.ReadInt32();
}

After the proxy sends the call request, the corresponding stub side receives it in PlayerServiceStub::OnRemoteRequest and dispatches to the matching function based on the request parameters. The Play operation ends up calling the stub's Play method.

int32_t PlayerServiceStub::Play()
{
    MediaTrace trace("binder::Play");
    CHECK_AND_RETURN_RET_LOG(playerServer_ != nullptr, MSERR_NO_MEMORY, "player server is nullptr");
    return playerServer_->Play();
}

Here Play is finally invoked through playerServer_, which is obtained via PlayerServer::Create() when the stub is initialized; in other words, it is a PlayerServer.

std::shared_ptr<IPlayerService> PlayerServer::Create()
{
    std::shared_ptr<PlayerServer> server = std::make_shared<PlayerServer>();
    CHECK_AND_RETURN_RET_LOG(server != nullptr, nullptr, "failed to new PlayerServer");
    (void)server->Init();
    return server;
}

Our Play call finally reaches PlayerServer::Play(). The whole playback process involves many states, so Play performs some state checks before calling the OnPlay method, which launches a playback task.

int32_t PlayerServer::Play()
{
    std::lock_guard<std::mutex> lock(mutex_);
    if (lastOpStatus_ == PLAYER_PREPARED || lastOpStatus_ == PLAYER_PLAYBACK_COMPLETE ||
        lastOpStatus_ == PLAYER_PAUSED) {
        return OnPlay();
    } else {
        MEDIA_LOGE("Can not Play, currentState is %{public}s", GetStatusDescription(lastOpStatus_).c_str());
        return MSERR_INVALID_OPERATION;
    }
}
int32_t PlayerServer::OnPlay()
{
    auto playingTask = std::make_shared<TaskHandler<void>>([this]() {
        MediaTrace::TraceBegin("PlayerServer::Play", FAKE_POINTER(this));
        auto currState = std::static_pointer_cast<BaseState>(GetCurrState());
        (void)currState->Play();
    });
    int ret = taskMgr_.LaunchTask(playingTask, PlayerServerTaskType::STATE_CHANGE);
    CHECK_AND_RETURN_RET_LOG(ret == MSERR_OK, ret, "Play failed");
    lastOpStatus_ = PLAYER_STARTED;
    return MSERR_OK;
}

Inside the playback task, Play() is invoked on the current state object, which routes the call according to the player's state machine:

int32_t PlayerServer::PreparedState::Play()
{
    return server_.HandlePlay();
}

Here, PlayerServer's HandlePlay method is called directly; HandlePlay reaches the GStreamer engine through playerEngine_, and GStreamer is the final implementation of playback.

int32_t PlayerServer::HandlePlay()
{
    int32_t ret = playerEngine_->Play();
    CHECK_AND_RETURN_RET_LOG(ret == MSERR_OK, MSERR_INVALID_OPERATION, "Engine Play Failed!");
    return MSERR_OK;
}
6. Summary

This article introduced media playback in the OpenHarmony 3.2 Beta multimedia subsystem, first walking through the overall playback flow and then analyzing the main playback steps in detail. Media playback is divided into the following layers:
(1) The native interfaces provided for applications to call; in practice the PlayerFactory::CreatePlayer() call returns a PlayerImpl instance.
(2) PlayerClient, which issues call requests to the remote service through the IPC proxy.
(3) PlayerServer, the implementation side of the playback service, which the client side calls into.
(4) GStreamer, which is invoked by PlayerServer and actually implements the media playback functionality.
Original title: OpenHarmony 3.2 Beta Multimedia Series: Audio/Video Playback Framework
Source: WeChat public account "OpenAtom OpenHarmony". Please credit the source when reprinting.