5.1 APIs, Icons, Features, Components & Permissions
# Permissions
Some of the elements that we use in React Native require setting permissions. How you do this is fairly straightforward using Expo. The Components and APIs that need permissions will have a method to request the permission, which you call before using the API.
There is a VSCode extension called vscode-expo that will give you code completion for editing the app.json and app.config.js files.
Most of the settings for your app, including permissions, will be set through the app.json file at the root of your project. This is just like the config.xml and package.json files for a Cordova project.
If you ever need to access the information from app.json inside your code, you can install expo-constants and the contents of the app.json file will be available through the Constants object. See the Guide to Constants (opens new window).

```js
import Constants from 'expo-constants';
```
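For example, here is a quick sketch of reading values back out, assuming a recent SDK where the parsed config is exposed as Constants.expoConfig (older SDKs used Constants.manifest):

```js
import Constants from 'expo-constants';

// values defined in app.json are available on the parsed config object
console.log(Constants.expoConfig.name);
console.log(Constants.expoConfig.version);
```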
- Guide to editing app.json and app.config.js (opens new window)
- General Permissions Guide for Expo (opens new window)
- List of all the Apple iOS keys for iOS.infoPlist in app.json (opens new window)
For even more customization you can also use the app.config.js file (opens new window)
# ImagePicker
The ImagePicker in Expo provides access to the system's UI for selecting images on the device. It works for both Android and iOS. Start by installing the package.

```sh
npx expo install expo-image-picker
```

Then we need to edit the app.json file to add permissions to access the camera and camera roll on iOS. First add a plugins section inside "expo".
```json
{
  "expo": {
    "plugins": [
      [
        "expo-image-picker",
        {
          "photosPermission": "Please iOS let me access the camera roll.",
          "cameraPermission": "Please iOS let me access the camera."
        }
      ]
    ]
  }
}
```
Then add inside the "ios" section:
"infoPlist": {
"NSMicrophoneUsageDescription": "Want to access your microphone because...",
"NSPhotoLibraryUsageDescription": "Want to access your photo library because...",
"NSCameraUsageDescription": "Want to use your camera because..."
}
2
3
4
5
and inside the "android" section:
"permissions": [
"CAMERA",
"READ_EXTERNAL_STORAGE",
"WRITE_EXTERNAL_STORAGE"
]
2
3
4
5
The permissions in app.json are set at install time. When the app runs we need to ask the user for permission with the async requestMediaLibraryPermissionsAsync() method.
```js
import { useState, useEffect } from 'react';
import * as ImagePicker from 'expo-image-picker';
//pay attention to how to import ImagePicker

export default function SomeScreen() {
  const [image, setImage] = useState(null);

  useEffect(() => {
    //on load of this screen / component
    //for Android and iOS not web
    (async () => {
      const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync();
      if (status !== 'granted') {
        alert("Fine. Then you can't use my app.");
      }
    })();
  }, []);

  return null; //render your UI here
}
```
Once you have permission then you can open the user's camera roll / image library with the launchImageLibraryAsync() method.
```js
const pickImage = async () => {
  //could be called with a button click
  let result = await ImagePicker.launchImageLibraryAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.All,
    allowsEditing: true,
    aspect: [4, 3],
    quality: 1,
  });
  if (!result.cancelled) {
    //setImage is our state variable to save the image source
    setImage(result.uri);
  }
};
```
Within our screen/component where we are going to display the selected image, we use an <Image /> component with a source property that accepts an object with a uri property. The value of the uri property is our state image variable.
```js
{image && <Image source={{ uri: image }} style={{ width: 200, height: 200 }} />}
```
There is also a launchCameraAsync() method that lets you launch the Camera from the ImagePicker. This is used in the demo repo. It is another async method that needs an options object to describe what type of media to get and the aspect ratio and quality to use. It returns either {cancelled: true} or { cancelled: false, type: 'image', uri, width, height, exif, base64 }. uri is the source to use in your Image component.
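As a rough sketch, a takePhoto function built on launchCameraAsync() could look like this; it assumes the same setImage state setter from the pickImage example and the older cancelled/uri result shape used above.

```js
// rough sketch - launch the camera instead of the image library
const takePhoto = async () => {
  const { status } = await ImagePicker.requestCameraPermissionsAsync();
  if (status !== 'granted') return;

  let result = await ImagePicker.launchCameraAsync({
    mediaTypes: ImagePicker.MediaTypeOptions.Images,
    allowsEditing: true,
    aspect: [4, 3],
    quality: 1,
  });
  if (!result.cancelled) {
    setImage(result.uri); //same state variable used by pickImage
  }
};
```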
You can find a simple demo using the ImagePicker and the camera APIs in the react native demo repo (opens new window). Open the App.js file and switch to the picker branch.
# DocumentPicker
The DocumentPicker is very similar to the ImagePicker. See the official DocumentPicker reference (opens new window).
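As a rough sketch, once expo-document-picker is installed, picking a file is a single async call; the exact shape of the result object depends on your SDK version, so check the reference.

```js
import * as DocumentPicker from 'expo-document-picker';

// rough sketch - opens the system file picker
const pickDocument = async () => {
  const result = await DocumentPicker.getDocumentAsync({ type: '*/*' });
  console.log(result); //check the reference for the result shape in your SDK version
};
```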
# Geolocation Location
To use the Location API from the Expo SDK we need to install the expo-location module (opens new window).

```sh
npx expo install expo-location
```
Some components will require you to manually edit the app.json permissions property or the iOS.infoPlist property (opens new window). Thankfully, with expo-location, both the ACCESS_COARSE_LOCATION and ACCESS_FINE_LOCATION permissions are implied and added to your app's permissions automatically.
To use the Expo location API, you will need to first make an asynchronous call for the permission with Location.requestForegroundPermissionsAsync() and then you can make your asynchronous call for the location with Location.getCurrentPositionAsync().
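Putting the two calls together might look something like this minimal sketch (the getLocation helper name is just for illustration):

```js
import * as Location from 'expo-location';

// rough sketch - ask permission, then read the current position
const getLocation = async () => {
  const { status } = await Location.requestForegroundPermissionsAsync();
  if (status !== 'granted') {
    return null;
  }
  const position = await Location.getCurrentPositionAsync({});
  return position.coords; //latitude, longitude, accuracy, etc.
};
```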
When doing this through the React Native CLI you will need to add permissions. For iOS you always need to provide a reason for the location permission. Here are the permissions added in app.json.
```json
{
  "expo": {
    "android": {
      "permissions": ["ACCESS_COARSE_LOCATION", "ACCESS_FINE_LOCATION"]
    },
    "ios": {
      "infoPlist": {
        "NSLocationAlwaysUsageDescription": "We need to display your location on the home screen."
      }
    }
  }
}
```
To see the demo code for the location API, go to the rn-demoapps repo (opens new window), navigate to the App.js file and change the branch to location.
# Cellular and Network
For the Cellular and Network APIs we need to install two packages.

```sh
npx expo install expo-cellular expo-network
```
Then you will need to import both in your screen/component.
```js
import * as Cellular from 'expo-cellular';
import * as Network from 'expo-network';
```
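As a rough sketch, reading a few values could look like this; it assumes Network.getNetworkStateAsync() and Cellular.getCellularGenerationAsync(), and both modules expose more than is shown here.

```js
// rough sketch - read a few values from the Network and Cellular modules
const checkConnection = async () => {
  const networkState = await Network.getNetworkStateAsync();
  console.log(networkState.type, networkState.isConnected);

  const generation = await Cellular.getCellularGenerationAsync();
  console.log(generation); //a CellularGeneration enumeration value
};
```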
For Android, add this permission if using the React Native CLI.
```json
{
  "android": {
    "permissions": ["READ_PHONE_STATE"]
  }
}
```
To see the demo code for Cellular, Network, Application, plus Platform, go to the rn-demoapps repo (opens new window), navigate to the App.js file and then change the branch to cellnet.
# ScreenOrientation
The screen orientation API from Expo allows us to determine the current screen orientation, which is an Orientation enumeration (0 - UNKNOWN, 1 - PORTRAIT_UP, 2 - PORTRAIT_DOWN, 3 - LANDSCAPE_LEFT, 4 - LANDSCAPE_RIGHT), as well as the current orientation lock. It also allows us to lock the orientation in a specific direction or add and remove listeners for orientation change events.
First you need to install the API.
```sh
npx expo install expo-screen-orientation
```
Then, if you want to support iOS, go to the app.json file and add:
```json
{
  "expo": {
    "ios": {
      "requireFullScreen": true
    }
  }
}
```
Warning
On iPad, since iOS 9, with the available split view mode, orientation is always landscape unless you have two apps open in split view mode.
In your React files you can import the API like this:
```js
import * as ScreenOrientation from 'expo-screen-orientation';
```
For getting the orientation call ScreenOrientation.getOrientationAsync() which returns a promise that resolves to an Orientation enumeration value. To add listeners use ScreenOrientation.addOrientationChangeListener(listener) where the listener is your function. It will be passed an OrientationChangeEvent with two properties - orientationInfo and orientationLock.
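A minimal sketch of both calls might look like this; the subscription object returned by addOrientationChangeListener is what you would pass to removeOrientationChangeListener later.

```js
// rough sketch - read the current orientation
const checkOrientation = async () => {
  const orientation = await ScreenOrientation.getOrientationAsync();
  console.log(orientation); //an Orientation enumeration value
};

// rough sketch - listen for orientation changes
const subscription = ScreenOrientation.addOrientationChangeListener((event) => {
  console.log(event.orientationInfo.orientation, event.orientationLock);
});
// later: ScreenOrientation.removeOrientationChangeListener(subscription);
```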
To see the demo code for ScreenOrientation go to the rn-demoapps repo (opens new window), navigate to the App.js file and then change the branch to orient.
# Camera
For Android there is an alternate API called Camera2, which you can access with the useCamera2Api prop.
The Camera API does NOT work on the iOS Simulator and only works on some Android Emulators. The best way to test your app and the Camera API is with a physical device.
Start by installing the expo API.
```sh
npx expo install expo-camera
```
Then import the camera object into your code.
```js
import { Camera } from 'expo-camera';
```
To use the camera for pictures you need the Permissions.CAMERA permission and for video you also need Permissions.AUDIO_RECORDING. In your code you will have to call the Camera.requestCameraPermissionsAsync() method before you can use the Camera.
Use a state variable to track if the permission is granted, denied, or not decided yet. Then that can be used to display your interface appropriately.
In your Component return you can create a Camera object. It can have a style object to provide dimensions for the Camera preview. It will also need a type property for selecting the front or back camera. You will also need to get a Ref to the Camera Component so you can call methods like takePictureAsync.
It is worth noting that, like Cordova, the image you take will be saved in the app cache, not the camera roll. You will need to use the FileSystem API to copy the image to a permanent location.
To see a working demo with the Camera go to the react native demo repo (opens new window), open the App.js file and switch to the camera branch.
Here is a good tutorial from FreeCodeCamp about using the Expo camera with React Native (opens new window). Bear in mind that it is a couple of years old and has some older conventions in the code.
# KeyboardAvoidingView
The KeyboardAvoidingView is a View element that can move out of the way when the keyboard opens in response to the user tapping on a TextInput component.
Reference for KeyboardAvoidingView (opens new window)
The Keyboard element lets you add listeners for when the keyboard is opened or closed. It also lets you call a dismiss() method to hide it.

Reference for the Keyboard module (opens new window).
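A small sketch of the Keyboard listeners and dismiss() might look like this; the keyboardDidShow and keyboardDidHide event names are the standard ones from React Native.

```js
import { useEffect } from 'react';
import { Keyboard } from 'react-native';

// rough sketch - add and remove keyboard listeners inside a component
export default function SomeForm() {
  useEffect(() => {
    const showSub = Keyboard.addListener('keyboardDidShow', () => console.log('keyboard open'));
    const hideSub = Keyboard.addListener('keyboardDidHide', () => console.log('keyboard closed'));
    return () => {
      showSub.remove();
      hideSub.remove();
    };
  }, []);

  //call Keyboard.dismiss() anywhere to programmatically hide the keyboard
  return null; //render your KeyboardAvoidingView and TextInputs here
}
```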
In the Simulator/Emulator the keyboard doesn't appear because you are using a physical keyboard to type. For testing, use a physical device.
You can find a simple demo using the KeyboardAvoidingView and the Keyboard APIs in the react native demo repo (opens new window). Open the App.js file and switch to the avoiding branch.
# ActivityIndicator
Built into React Native is the ActivityIndicator (opens new window) component. It will show a circular loading indicator. When you don't want to build your own spinner, this one works well.
It has a few properties that you can use to customize its appearance.
```js
import { ActivityIndicator, StyleSheet, View } from 'react-native';

const App = () => (
  <View style={styles.container}>
    <ActivityIndicator />
    <ActivityIndicator size="large" />
    <ActivityIndicator size="small" color="#0000ff" />
    <ActivityIndicator size="large" color="#00ff00" />
  </View>
);

const styles = StyleSheet.create({
  container: { flex: 1, justifyContent: 'center' },
});
```
If you use a ref for the ActivityIndicator then you can also toggle the boolean property animating to show and hide it. However, in nearly all cases it is better to use a state variable or prop and use conditional rendering to show or not show the component.
```js
//isLoading is a state variable set to false once your data has loaded
return isLoading ? <ActivityIndicator /> : <MyFlatListWithDataComponent />;
```
As a descendant of View, the ActivityIndicator inherits all the common View props. See the full list here (opens new window).

No demo for this, just the code snippet.
# Spinner from NativeBase
If you looked at the NativeBase component library last week and liked how it was made, you can use the Spinner from NativeBase.
Native Base Loading Spinner video (opens new window)
Update
As previously discussed, NativeBase was replaced by GlueStack UI. So, here is the GlueStack spinner reference (opens new window).
# LayoutAnimation
The LayoutAnimation API is built into React Native and automatically animates views to their new positions when the "next layout" happens. A layout can be thought of as a change in content on the screen that requires a repaint. The LayoutAnimation API has a configureNext() method that lets you describe how you want the change to happen, or what animation to run on the "next layout".

There is nothing to install but you will have to import LayoutAnimation and UIManager from react-native.
On Android, to use it directly you will have to add this one snippet OUTSIDE your component function.
```js
import { LayoutAnimation, Platform, UIManager } from 'react-native';

if (Platform.OS === 'android') {
  if (UIManager.setLayoutAnimationEnabledExperimental) {
    UIManager.setLayoutAnimationEnabledExperimental(true);
  }
}
```
Full Reference for LayoutAnimation (opens new window).
The primary method that you will use is the configureNext(config, onAnimationDidEnd?, onAnimationDidFail?) method. It has a required config parameter which describes the type of animation to use for the "next layout". The config parameter is an object with four properties - duration, create, update, and delete.

create, update, and delete are optional config objects for what to do when animating new views, animating views that have been updated, and animating views as they are removed. All three have a required type property and optional property, springDamping, initialVelocity, delay, and duration properties.
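A hand-written config might look something like this sketch; the durations and types are just illustrative values.

```js
LayoutAnimation.configureNext({
  duration: 500,
  create: { type: LayoutAnimation.Types.easeInEaseOut, property: LayoutAnimation.Properties.opacity },
  update: { type: LayoutAnimation.Types.spring, springDamping: 0.7 },
  delete: { type: LayoutAnimation.Types.linear, property: LayoutAnimation.Properties.opacity },
});
```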
Because it can be a lot of work to set up and edit all these config values, LayoutAnimation does contain a few preset values (opens new window) that you can use like this.
```js
<Pressable
  onPress={() => {
    LayoutAnimation.configureNext(LayoutAnimation.Presets.spring);
    //other code
  }}
>
  {/* Pressable content */}
</Pressable>
```
The two optional parameters onAnimationDidEnd and onAnimationDidFail are callback functions that will run when the animation completes or fails.
It also has a create() method which you could use to create your own preset values to be reused with multiple animations.
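For example, create(duration, type, creationProp) returns a config object you can hand to configureNext(); the 400ms value here is just an illustration.

```js
// rough sketch - a reusable 400ms fade preset for views being added or removed
const myPreset = LayoutAnimation.create(
  400,
  LayoutAnimation.Types.easeInEaseOut,
  LayoutAnimation.Properties.opacity
);
LayoutAnimation.configureNext(myPreset);
```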
The LayoutAnimation uses the opacity, scaleX, scaleY, and scaleXY properties to do its animations. This makes it much simpler to implement but more limited than what you would do with the Animated API.
You can find a simple demo using the LayoutAnimation API in the react native demo repo (opens new window). Open the App.js file and switch to the layout branch.
# Contacts
See the official Contacts reference (opens new window)
# CaptureRef
See the official CaptureRef reference (opens new window)
# DateTimePicker
See the official DateTimePicker reference (opens new window)
# FirebaseCore
See the official FirebaseCore reference (opens new window) to get started connecting to your Firebase account.
# FaceDetector
See the official FaceDetector reference (opens new window).