React Native Allowing Haptics in iOS


Question

I want to test the allowHapticsAndSystemSoundsDuringRecording variable in my React Native project. I know that there is an instance method that allows you to change this value, but I'm not sure how to use it in the .mm file I've created for testing. I'm not familiar with Objective-C or Swift, so that makes things a bit more difficult. Just typing out setAllowHapticsAndSystemSoundsDuringRecording(true); in a .mm file throws an error because the method is not recognized. Is this something I need to import from somewhere? I get that I probably have to create an instance of something, but I'm not sure what that would be.


Answer 1

Score: 2

setAllowHapticsAndSystemSoundsDuringRecording is a method that belongs to the AVAudioSession class in iOS. It is not a React Native function and cannot be called directly from a React Native JavaScript file.

Using this method means dropping into native code, either Objective-C or Swift, and you need to be familiar with how to bridge between React Native's JavaScript and that native code. You are correct that the method has to be called on an instance of AVAudioSession (in practice, the shared instance).
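
Before wiring up a full module, it can help to see the bare instance call on its own. The following is a minimal sketch, assuming it lives in an Objective-C++ (.mm) file that imports AVFoundation; enableHapticsDuringRecording is just a hypothetical helper name, and the @available guard is needed because the setter only exists on iOS 13 and later:

// Minimal sketch: call the setter on the shared AVAudioSession instance.
#import <AVFoundation/AVFoundation.h>

static void enableHapticsDuringRecording(void) {
  if (@available(iOS 13.0, *)) {
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    // setAllowHapticsAndSystemSoundsDuringRecording:error: returns NO on failure.
    if (![session setAllowHapticsAndSystemSoundsDuringRecording:YES error:&error]) {
      NSLog(@"Failed to allow haptics during recording: %@", error);
    }
  }
}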

Here is a rough example of how you might use this from Objective-C:

First, create a header file named MyAudioSessionManager.h in the ios folder of your React Native project, declaring a class that conforms to RCTBridgeModule:

// MyAudioSessionManager.h
#import <React/RCTBridgeModule.h>

// Declares the native module. The method is exported from the .mm file below
// with RCT_EXPORT_METHOD; RCT_EXTERN_MODULE/RCT_EXTERN_METHOD are only needed
// when the implementation is written in Swift.
@interface MyAudioSessionManager : NSObject <RCTBridgeModule>
@end

Then create MyAudioSessionManager.mm with the actual implementation:

// MyAudioSessionManager.mm
#import "MyAudioSessionManager.h"
#import <AVFoundation/AVFoundation.h>

@implementation MyAudioSessionManager

RCT_EXPORT_MODULE();

RCT_EXPORT_METHOD(setAllowHapticsAndSystemSounds:(BOOL)allow
                  resolver:(RCTPromiseResolveBlock)resolve
                  rejecter:(RCTPromiseRejectBlock)reject)
{
  AVAudioSession *session = [AVAudioSession sharedInstance];
  NSError *error = nil;

  // Configure the session for recording before touching the haptics option.
  if (![session setCategory:AVAudioSessionCategoryPlayAndRecord
                       mode:AVAudioSessionModeDefault
                    options:AVAudioSessionCategoryOptionAllowBluetoothA2DP | AVAudioSessionCategoryOptionDefaultToSpeaker | AVAudioSessionCategoryOptionAllowAirPlay
                      error:&error]) {
    reject(@"Error", @"Error setting up audio session category", error);
    return;
  }

  // setAllowHapticsAndSystemSoundsDuringRecording:error: only exists on iOS 13+.
  if (@available(iOS 13.0, *)) {
    if (![session setAllowHapticsAndSystemSoundsDuringRecording:allow error:&error]) {
      reject(@"Error", @"Error setting allowHapticsAndSystemSoundsDuringRecording", error);
      return;
    }
  } else {
    reject(@"Not available", @"setAllowHapticsAndSystemSoundsDuringRecording is not available on this iOS version", nil);
    return;
  }

  [session setActive:YES error:nil];
  resolve(@(YES));
}

@end

This is a simple example and does not handle every possible error; a real implementation should take care to handle failures properly.

Finally, to expose this functionality to React Native, you need to use React Native's bridging capabilities: create a new native module or method, expose it to React Native, and then call it from your JavaScript code.

Example:

import { NativeModules } from 'react-native';

// Inside a component or function
NativeModules.MyAudioSessionManager.setAllowHapticsAndSystemSounds(true)
  .then(() => console.log('Haptics and System Sounds set successfully'))
  .catch((error) => console.error('Error setting Haptics and System Sounds:', error));

You can read more about native modules in the official React Native documentation: Native Modules (iOS).

Keep in mind that manipulating AVAudioSession directly is a relatively advanced operation that requires careful error handling and an understanding of iOS audio behavior, so only do this if you are sure it is necessary. In many cases you can accomplish what you need with higher-level APIs provided by React Native or with libraries available from the community.

