Titanium JIRA Archive
Titanium SDK/CLI (TIMOB)

[TIMOB-23907] Hyperloop: Incomplete Metadata for forward declarations

GitHub Issue: n/a
Type: Bug
Priority: High
Status: Closed
Resolution: Fixed
Resolution Date: 2017-03-03T14:02:57.000+0000
Affected Version/s: n/a
Fix Version/s: Hyperloop 2.0.1
Components: n/a
Labels: n/a
Reporter: Jan Vennemann
Assignee: Jan Vennemann
Created: 2016-09-15T06:45:42.000+0000
Updated: 2017-03-16T13:11:47.000+0000

Description

The class SFSpeechRecognitionResult is detected, but its generated metadata does not include any information about its properties or methods. This happens because the class is only forward declared with @class and is then used as the parameter type of a block, which is itself an argument to another method. From SFSpeechRecognizer.h:
- (SFSpeechRecognitionTask *)recognitionTaskWithRequest:(SFSpeechRecognitionRequest *)request resultHandler:(void (^)(SFSpeechRecognitionResult * __nullable result, NSError * __nullable error))resultHandler;
This probably applies to other classes that are referenced in the same way.
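A minimal sketch of the header pattern the metadata generator trips over (the surrounding @interface is illustrative, not the full SDK header):

```objc
// Forward declaration only -- the class's @interface is not visible here,
// so a parser that stops at this point sees the type name but none of its
// properties or methods.
@class SFSpeechRecognitionResult;

@interface SFSpeechRecognizer : NSObject
// SFSpeechRecognitionResult appears only as a block parameter type,
// nested inside another method's argument list.
- (SFSpeechRecognitionTask *)recognitionTaskWithRequest:(SFSpeechRecognitionRequest *)request
                                          resultHandler:(void (^)(SFSpeechRecognitionResult *result, NSError *error))resultHandler;
@end
```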

Attachments

File: one_more_thing.mp3
Date: 2017-02-23T14:56:48.000+0000
Size: 220472 bytes

Comments

  1. Jan Vennemann 2017-02-23

    PR (master): https://github.com/appcelerator/hyperloop.next/pull/125
    PR (2_0_X): https://github.com/appcelerator/hyperloop.next/pull/126

    *Testing steps*

    1. Create a new Hyperloop-enabled classic app
    2. Add a usage description to the plist section in tiapp.xml
       <key>NSSpeechRecognitionUsageDescription</key>
       <string>Use speech recognition</string>
       
    3. Paste the following code in your app.js
       var SFSpeechRecognizer = require("Speech/SFSpeechRecognizer");
       var SFSpeechURLRecognitionRequest = require("Speech/SFSpeechURLRecognitionRequest");
       var NSBundle = require('Foundation/NSBundle');
       var NSLocale = require("Foundation/NSLocale");
       var NSURL = require('Foundation/NSURL');
       var speechRecognizer = SFSpeechRecognizer.alloc().initWithLocale(NSLocale.alloc().initWithLocaleIdentifier("en_US"));
       if (speechRecognizer.isAvailable()) {
         var soundPath = NSBundle.mainBundle.pathForResourceOfType("one_more_thing", "mp3");
         var soundURL = NSURL.fileURLWithPath(soundPath);
         var request = SFSpeechURLRecognitionRequest.alloc().initWithURL(soundURL);
         speechRecognizer.recognitionTaskWithRequestResultHandler(request, function(result, error) {
           Ti.API.debug(result.bestTranscription.formattedString);
           Ti.API.debug(result.isFinal());
         });
       } else {
         Ti.API.info('Speech recognizer not available');
       }
       
    4. Save the attached audio file under Resources/iphone
    5. Build and launch the app on a device. It should log the transcription as it progresses.
  2. Harry Bryant 2017-03-16

    Verified as fixed. Testing with both Hyperloop module versions 2.0.1 & 2.1.0, the demo code provided above now transcribes audio files correctly.

    Tested on:
    Hyperloop Module (2.0.1 / 2.1.0)
    CocoaPods 1.2.0
    iPhone 7 (iOS 10.2) device
    macOS Sierra (10.12.2)
    Ti SDK: 6.0.3.v20170314141715
    Appc NPM: 4.2.9-1
    Appc CLI: 6.1.0
    Xcode 8.2.1
    Node v4.6.0

    *Closing ticket.*

JSON Source