[TIMOB-23907] Hyperloop: Incomplete Metadata for forward declarations
| GitHub Issue | n/a |
|---|---|
| Type | Bug |
| Priority | High |
| Status | Closed |
| Resolution | Fixed |
| Resolution Date | 2017-03-03T14:02:57.000+0000 |
| Affected Version/s | n/a |
| Fix Version/s | Hyperloop 2.0.1 |
| Components | n/a |
| Labels | n/a |
| Reporter | Jan Vennemann |
| Assignee | Jan Vennemann |
| Created | 2016-09-15T06:45:42.000+0000 |
| Updated | 2017-03-16T13:11:47.000+0000 |
Description
The class SFSpeechRecognitionResult is detected, but its generated metadata contains no information about its properties or methods. This is because the class is only forward declared with @class and is then used as a block argument, where that block is itself an argument to another method.
From SFSpeechRecognizer.h:

```objc
- (SFSpeechRecognitionTask *)recognitionTaskWithRequest:(SFSpeechRecognitionRequest *)request resultHandler:(void (^)(SFSpeechRecognitionResult * __nullable result, NSError * __nullable error))resultHandler;
```
This likely applies to other classes that are referenced in the same way.
Attachments
| File | Date | Size (bytes) |
|---|---|---|
| one_more_thing.mp3 | 2017-02-23T14:56:48.000+0000 | 220472 |
PR (master): https://github.com/appcelerator/hyperloop.next/pull/125
PR (2_0_X): https://github.com/appcelerator/hyperloop.next/pull/126

*Testing steps*
1. Create a new Hyperloop-enabled classic app.
2. Add a speech recognition usage description to the plist section in tiapp.xml.
3. Paste the following code into your app.js:

```javascript
var SFSpeechRecognizer = require('Speech/SFSpeechRecognizer');
var SFSpeechURLRecognitionRequest = require('Speech/SFSpeechURLRecognitionRequest');
var NSBundle = require('Foundation/NSBundle');
var NSLocale = require('Foundation/NSLocale');
var NSURL = require('Foundation/NSURL');

var speechRecognizer = SFSpeechRecognizer.alloc().initWithLocale(NSLocale.alloc().initWithLocaleIdentifier('en_US'));
if (speechRecognizer.isAvailable()) {
    var soundPath = NSBundle.mainBundle.pathForResourceOfType('one_more_thing', 'mp3');
    var soundURL = NSURL.fileURLWithPath(soundPath);
    var request = SFSpeechURLRecognitionRequest.alloc().initWithURL(soundURL);
    speechRecognizer.recognitionTaskWithRequestResultHandler(request, function(result, error) {
        Ti.API.debug(result.bestTranscription.formattedString);
        Ti.API.debug(result.isFinal());
    });
} else {
    Ti.API.info('Speech recognizer not available');
}
```

4. Save the attached audio file under Resources/iphone.
5. Build and launch the app on a device. It should log the transcription process.

Verified as fixed: testing with both Hyperloop module versions 2.0.1 and 2.1.0, the demo code provided above now transcribes audio files correctly.

Tested on:
- Hyperloop Module 2.0.1 / 2.1.0
- CocoaPods 1.2.0
- iPhone 7, iOS 10.2 (device)
- macOS Sierra (10.12.2)
- Ti SDK 6.0.3.v20170314141715
- Appc NPM 4.2.9-1
- Appc CLI 6.1.0
- Xcode 8.2.1
- Node v4.6.0

*Closing ticket.*