Titanium JIRA Archive
Titanium SDK/CLI (TIMOB)

[TIMOB-24909] iOS: Move Ti.Media.AudioPlayer to AVPlayer internally, remove external library-dependency

GitHub Issue: n/a
Type: Improvement
Priority: High
Status: Closed
Resolution: Fixed
Resolution Date: 2018-07-28T17:39:38.000+0000
Affected Version/s: n/a
Fix Version/s: Release 7.5.0
Components: iOS
Labels: audioplayer, demo_app
Reporter: Hans Knöchel
Assignee: Hans Knöchel
Created: 2017-06-29T19:44:04.000+0000
Updated: 2018-10-10T12:48:38.000+0000

Description

Right now, we use the outdated and unmaintained iOS library "AudioStreamer" for our Ti.Media.AudioPlayer API. Modern iOS APIs can help us here, especially AVPlayer inside AVFoundation. This will also resolve a number of tickets with a high number of watchers:
- TIMOB-19519
- TIMOB-3375

The time frame should be around SDK release 7.1.0 / 7.2.0, with 100% backwards compatibility. See the [Ti.Media.AudioPlayer](http://docs.appcelerator.com/platform/latest/#!/api/Titanium.Media.AudioPlayer) docs for more information.

Attachments

File: sample.mp3
Date: 2018-01-15T08:28:03.000+0000
Size: 37596

Comments

  1. Patrick Mounteney 2018-01-13

    I must admit I gave up on this and implemented my own player in Hyperloop using AVPlayer. It works quite well, except that I have not been able to attach any kind of listener to get feedback from the player. AVPlayer has no delegate and Apple expects us to use KVO, but nobody has been able to tell me how to do that in Hyperloop! I ended up using a 3-second setInterval to poll the player's 'timeControlStatus' property. So a new version of Ti.Media.AudioPlayer using AVPlayer with player monitoring would be good!
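    The polling workaround above can be sketched in plain JavaScript. This is an illustrative stub, not actual Hyperloop code: `createStubPlayer` and `makeStatusPoller` are hypothetical names, and the numeric values mirror AVPlayer's timeControlStatus enum (0 = paused, 1 = waitingToPlayAtSpecifiedRate, 2 = playing).

```javascript
// Hypothetical sketch of the setInterval-based workaround described above.
// createStubPlayer stands in for a Hyperloop-wrapped AVPlayer instance.
function createStubPlayer() {
    return { timeControlStatus: 0 }; // 0 = AVPlayerTimeControlStatusPaused
}

// Returns a poll function that fires onChange whenever the observed
// property changed since the last poll. Drive it with a timer, e.g.
// setInterval(poll, 3000), to emulate the 3-second polling approach.
function makeStatusPoller(player, onChange) {
    var lastStatus = player.timeControlStatus;
    return function poll() {
        if (player.timeControlStatus !== lastStatus) {
            lastStatus = player.timeControlStatus;
            onChange(lastStatus);
        }
    };
}
```

    KVO would push these changes natively instead of pulling them on a timer, which is exactly what moving the SDK to AVPlayer internally enables.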
  2. Hans Knöchel 2018-01-14

    PR: https://github.com/appcelerator/titanium_mobile/pull/9732 Test-Cases: *#1: Remote file, some events*
       var win = Ti.UI.createWindow({
           title: 'Audio Test',
           backgroundColor: '#fff',
           layout: 'vertical'
       });
       
       var startStopButton = Ti.UI.createButton({
           title: 'Start/Stop Streaming',
           top: 10,
       });
       
       var pauseResumeButton = Ti.UI.createButton({
           title: 'Pause/Resume Streaming',
           top: 10,
           enabled: false
       });
       
       var changeURLButton = Ti.UI.createButton({
           title: 'Change URL',
           top: 10
       });
       
       win.add(startStopButton);
       win.add(pauseResumeButton);
       win.add(changeURLButton);
       
       // allowBackground: true on Android allows the
       // player to keep playing when the app is in the
       // background.
       var audioPlayer = Ti.Media.createAudioPlayer({
           url: 'http://www.noiseaddicts.com/samples_1w72b820/13.mp3',
           allowBackground: true
       });
       
       startStopButton.addEventListener('click',function() {
           // When paused, playing returns false.
           // If both are false, playback is stopped.
           if (audioPlayer.playing || audioPlayer.paused) {
               audioPlayer.stop();
               pauseResumeButton.enabled = false;
            if (Ti.Platform.osname === 'android') {
                audioPlayer.release();
            }
           } else {
               audioPlayer.start();
               pauseResumeButton.enabled = true;
           }
       });
       
       pauseResumeButton.addEventListener('click', function() {
           if (audioPlayer.paused) {
               audioPlayer.start();
           } else {
               audioPlayer.pause();
           }
       });
       
       changeURLButton.addEventListener('click', function() {
         audioPlayer.setUrl('sample.mp3');
       });
       
       audioPlayer.addEventListener('progress', function(e) {
           Ti.API.info('Time Played: ' + Math.round(e.progress) + ' milliseconds');
       });
       
       audioPlayer.addEventListener('change', function(e) {
           Ti.API.info('State: ' + e.description + ' (' + e.state + ')');
       });
       
       win.addEventListener('close',function() {
           audioPlayer.stop();
        if (Ti.Platform.osname === 'android') {
            audioPlayer.release();
        }
       });
       
       win.open();
       
    *#2: Real time radio streams*:
       var win = Ti.UI.createWindow({
           title: 'Audio Test',
           backgroundColor: '#fff',
           layout: 'vertical'
       });
       
       var startStopButton = Ti.UI.createButton({
           title: 'Start/Stop Streaming',
           top: 10,
           width: 200,
           height: 40
       });
       
       var pauseResumeButton = Ti.UI.createButton({
           title: 'Pause/Resume Streaming',
           top: 10,
           width: 200,
           height: 40,
           enabled: false
       });
       
       win.add(startStopButton);
       win.add(pauseResumeButton);
       
       var audioPlayer = Ti.Media.createAudioPlayer({
           url: 'http://ca2.rcast.net:8044/'
       });
       
       startStopButton.addEventListener('click',function() {
           // When paused, playing returns false.
           // If both are false, playback is stopped.
           Ti.API.info('PLAYING = ' + audioPlayer.playing);
           Ti.API.info('PAUSED = ' + audioPlayer.paused);
       
           if (audioPlayer.playing || audioPlayer.paused) {
               audioPlayer.stop();
               pauseResumeButton.enabled = false;
           } else {
               audioPlayer.start();
               pauseResumeButton.enabled = true;
           }
       });
       
       pauseResumeButton.addEventListener('click', function() {
         Ti.API.info('PLAYING = ' + audioPlayer.playing);
         Ti.API.info('PAUSED = ' + audioPlayer.paused);
       
           if (audioPlayer.paused) {
               audioPlayer.start();
           } else {
               audioPlayer.pause();
           }
       });
       
       audioPlayer.addEventListener('progress', function(e) {
           Ti.API.info('Time Played: ' + Math.round(e.progress) + ' milliseconds');
       });
       
       audioPlayer.addEventListener('change', function(e) {
           Ti.API.info('State: ' + e.description + ' (' + e.state + ')');
       });
       
       audioPlayer.addEventListener('metadata', function(e) {
         Ti.API.info(e);
       });
       
       win.addEventListener('close',function() {
           audioPlayer.stop();
       });
       
       win.open();
       
  3. Hans Knöchel 2018-04-03

    For the watchers of this ticket: now is the chance to propose APIs that could be added with AVPlayer. Let me know if there are specific functionalities that would be useful to the general developer.
  4. Patrick Mounteney 2018-04-03

    Thanks for reaching out, Hans. For me, as I am using AudioPlayer for radio streaming, being able to pick up timedMetadata for the playing artist & title is important. In fact, so much so that I have written my own module using AVPlayer (which I am still testing). But as you are way ahead of me on the Objective-C skills, I am happy to use the updated official Appcelerator AudioPlayer API when it is released. Have you any rough idea of when SDK 7.2.0 is likely to happen? Cheers.
  5. Hans Knöchel 2018-04-03

    Thanks [~patrickmounteney]. The timedMetadata API is huge. Which APIs from there are relevant? Only the value of each item in the array? *EDIT*: Looks like key, keySpace, value & extraAttributes are pretty powerful already (some debug output from Xcode):
       (lldb) po playerItem.timedMetadata
       <__NSArrayI 0x7f8898e4afe0>(
       <AVMetadataItem: 0x7f8898e4b300, identifier=common/title, keySpace=comn, key class = __NSCFConstantString, key=title, commonKey=title, extendedLanguageTag=(null), dataType=(null), time={88704/44100 = 2.011}, duration={INVALID}, startDate=(null), extras={
       }, value=Lar Wolkan - Too Much of Not Enough>,
       <AVMetadataItem: 0x7f889911a870, identifier=common/publisher, keySpace=comn, key class = __NSCFConstantString, key=publisher, commonKey=publisher, extendedLanguageTag=(null), dataType=(null), time={88704/44100 = 2.011}, duration={INVALID}, startDate=(null), extras={
       }, value=t>
       )
       (lldb) po (AVMetadataItem *)playerItem.timedMetadata[0]
       <AVMetadataItem: 0x7f8898e4b300, identifier=common/title, keySpace=comn, key class = __NSCFConstantString, key=title, commonKey=title, extendedLanguageTag=(null), dataType=(null), time={88704/44100 = 2.011}, duration={INVALID}, startDate=(null), extras={
       }, value=Lar Wolkan - Too Much of Not Enough>
       
       (lldb) po [(AVMetadataItem *)playerItem.timedMetadata[0] key]
       title
       
       (lldb) po [(AVMetadataItem *)playerItem.timedMetadata[0] value]
       Lar Wolkan - Too Much of Not Enough
       
       (lldb) po [(AVMetadataItem *)playerItem.timedMetadata[0] keySpace]
       comn
       
       (lldb) po [(AVMetadataItem *)playerItem.timedMetadata[0] extraAttributes]
       {
       }
       
    Mapped into a Titanium event:
       items =     (
                       {
                   extraAttributes =             {
                   };
                   key = title;
                   keySpace = comn;
                   value = "Pops Staples - Somebody Was Watching";
               },
                       {
                   extraAttributes =             {
                   };
                   key = publisher;
                   keySpace = comn;
                   value = t;
               }
           );
           source = "[object TiMediaAudioPlayer]";
           type = metadata;
       
     This is exposed via a new metadata event. The timeline for 7.2.0 is over the next few months, but you can use the pull request and patch your 7.x SDK today.
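     The native-to-JS mapping shown above can be illustrated in plain JavaScript. This is a sketch only; the real conversion happens natively in the SDK, and `mapMetadataItems` is a hypothetical name used here for illustration.

```javascript
// Illustrative sketch: flatten AVMetadataItem-like objects into the
// plain key/keySpace/value/extraAttributes shape that the metadata
// event payload above carries.
function mapMetadataItems(avItems) {
    return avItems.map(function (item) {
        return {
            key: item.key,
            keySpace: item.keySpace,
            value: item.value,
            extraAttributes: item.extraAttributes || {}
        };
    });
}
```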
  6. Patrick Mounteney 2018-04-03

    All I am doing is sticking a listener on the playerItem as per:
       [playerItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:nil];
    and picking it up in 'observeValueForKeyPath' like this:
       if ([keyPath isEqualToString:@"timedMetadata"]) {
           // NSLog(@"[INFO] timedMetadata changed");
           AVPlayerItem *playerItem = object;
           for (AVMetadataItem *metadata in playerItem.timedMetadata) {
               [self fireEvent:@"metadata" withObject:@{ @"title": metadata.stringValue }]; // fire back to host app
           }
       }
    But I have not worked with Objective-C much and I expect there is a better way to do it. Sorry, how does one get code to appear in a nice box as you have it?
  7. Hans Knöchel 2018-04-03

    Use the code tags for that (or select them from the "+" at the right). I've just pushed [this commit](https://github.com/appcelerator/titanium_mobile/pull/9732/commits/94b748ccdfbdedae943b94616bdeac9b78acbe76) that adds the metadata event as part of the main [pull request](https://github.com/appcelerator/titanium_mobile/pull/9732). Try it out!
  8. Patrick Mounteney 2018-04-03

    That's great Hans. I'll have a look when I have a moment.
  9. Patrick Mounteney 2018-04-04

    Another useful addition would be to make Ti.Media.AudioPlayer work with the iOS MPNowPlayingInfoCenter and the lockscreen. Something I have been unable to do in my little module. There is an existing module for this (by Foddy), but I don't think it is able to update in the background.
  10. Rene Pot 2018-05-01

    [~patrickmounteney] That would be the Ti.Media.MusicPlayer, not the AudioPlayer.
  11. Patrick Mounteney 2018-05-02

    Well, in my DIY module I am using MPNowPlayingInfoCenter as per:
       MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
       [infoCenter setNowPlayingInfo:@{ MPMediaItemPropertyTitle: title, MPMediaItemPropertyArtist: artist }];
    The title and artist are obtained from AVPlayerItem's timedMetadata, and that does seem to update the lock screen whilst the player is in the background.
  12. Hans Knöchel 2018-05-02

    [~patrickmounteney] The initial implementation has already been finished, so new features won't slip into the core change of the API at this time. But you can easily use Hyperloop for this, even today:
        var MPNowPlayingInfoCenter = require('MediaPlayer/MPNowPlayingInfoCenter');
        var MediaPlayer = require('MediaPlayer/MediaPlayer');
        var MPMediaItemPropertyTitle = MediaPlayer.MPMediaItemPropertyTitle;
        var MPMediaItemPropertyArtist = MediaPlayer.MPMediaItemPropertyArtist;
        
        // ES 6+, Hyperloop 3.1.0+
        // import { MPNowPlayingInfoCenter, MediaPlayer } from 'MediaPlayer';
        
        MPNowPlayingInfoCenter.defaultCenter.setNowPlayingInfo({
          MPMediaItemPropertyTitle: 'My Title',
          MPMediaItemPropertyArtist: 'My Artist'
        });
        
    Possibly the JavaScript object still needs to be mapped to a native NSDictionary, but it could even work like this already.
  13. Samir Mohammed 2018-10-10

    *Closing ticket.* Verified improvement on SDK version 7.5.0.v20181008124804. *FR Passed (Test Steps)*

    1. Created a new Titanium application
    2. Added the first test case mentioned above into the app.js
    3. Ran the program
    4. Started the stream and was able to hear sound
    5. Pressed stop and the sound stopped
    6. Was also able to pause and resume using the pause/resume stream button
    7. Pressed Change URL
    8. The URL was changed and the local file started to play
    9. Backgrounded the application and the sound was still playing

    The above steps were also used for the second test case (real-time radio streams).

    *Test Environment*
        APPC Studio: 5.1.0.201808080937
        iPhone 6 Sim (12.0)
        APPC CLI: 7.0.7-master.4
        Operating System Name: macOS Mojave
        Operating System Version: 10.14
        Node.js Version: 8.9.1
        Xcode 10.0
        
