
Quick Actions In Swift

Apple has added quick actions to the app icon so users can deep link into an area of your app more quickly. By pressing the app icon on a device that supports 3D Touch, the user is presented with a set of quick actions. When the user selects a quick action, your app activates or launches, and your app delegate object receives the quick action message.

DEFINE QUICK ACTIONS

In your app’s Info.plist file, create a UIApplicationShortcutItems array. This is where we define each quick action: its type, title, subtitle, and the other keys listed below (a sample entry follows the list). Note that an app can have a maximum of 4 quick actions on its icon.

UIApplicationShortcutItemType – A required string delivered to your app when the user invokes the corresponding quick action.

UIApplicationShortcutItemTitle – A required string displayed to the user on the Home screen as the name of the quick action.

UIApplicationShortcutItemSubtitle – An optional string that is displayed to the user on the Home screen, immediately below the corresponding title string.

UIApplicationShortcutItemIconType – An optional string specifying the type of an icon from the system-provided library.

UIApplicationShortcutItemIconFile – An optional string specifying an icon image to use from the app’s bundle, or the name of an image in an asset catalog.

UIApplicationShortcutItemUserInfo – An optional, app-defined dictionary. One use for this dictionary is to provide app version information.
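
For example, a single UIApplicationShortcutItems entry in Info.plist (source view) might look like the sketch below; the type suffix, titles, and icon choice are placeholders you would replace with your own:

    <key>UIApplicationShortcutItems</key>
    <array>
        <dict>
            <key>UIApplicationShortcutItemType</key>
            <string>$(PRODUCT_BUNDLE_IDENTIFIER).first</string>
            <key>UIApplicationShortcutItemTitle</key>
            <string>Search</string>
            <key>UIApplicationShortcutItemSubtitle</key>
            <string>Find an item quickly</string>
            <key>UIApplicationShortcutItemIconType</key>
            <string>UIApplicationShortcutIconTypeSearch</string>
        </dict>
    </array>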

Next you can run your app and check that the quick actions appear the way you expect.

Note: You must develop on a device that supports 3D Touch. The simulator in Xcode does not support 3D Touch.

HANDLE THE SHORTCUTS

Begin by adding the enum and properties we are going to need in the methods that follow. If you don’t want to use an enum, make sure your names match the UIApplicationShortcutItemType values entered in your Info.plist.
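
A minimal sketch of such an enum and the stored property used in the following steps (the case names here are hypothetical and must match the suffix of your UIApplicationShortcutItemType strings):

    // In AppDelegate.swift
    enum ShortcutIdentifier: String {
        case first
        case second

        // Builds an identifier from the full type string, e.g. "com.example.app.first".
        init?(fullType: String) {
            guard let last = fullType.components(separatedBy: ".").last else { return nil }
            self.init(rawValue: last)
        }
    }

    // Shortcut item saved when the app is launched (rather than resumed) from a quick action.
    var launchedShortcutItem: UIApplicationShortcutItem?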

Read in the UIApplicationShortcutItem selected by the user in the didFinishLaunchingWithOptions method. Here we save that value into launchedShortcutItem so we can handle it later.
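
A sketch of that check, using the launchedShortcutItem property declared above:

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // If the app was launched from a quick action, stash the item so it can be handled later.
        if let shortcutItem = launchOptions?[UIApplication.LaunchOptionsKey.shortcutItem] as? UIApplicationShortcutItem {
            launchedShortcutItem = shortcutItem
        }
        return true
    }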

The next method to get called is applicationDidBecomeActive. It is called after didFinishLaunchingWithOptions during the first launch of your app, and every time the user returns to your app while it is still open in the background.
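
A sketch of handling the saved item there (handleShortcutItem(_:) is the helper defined a couple of steps below):

    func applicationDidBecomeActive(_ application: UIApplication) {
        // Handle a quick action saved at launch, then clear it so it only runs once.
        guard let shortcutItem = launchedShortcutItem else { return }
        _ = handleShortcutItem(shortcutItem)
        launchedShortcutItem = nil
    }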

When a user chooses one of the quick actions, the system launches or resumes your app and calls the performActionForShortcutItem method in your app delegate.
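
A minimal sketch of that delegate method, forwarding to the same helper:

    func application(_ application: UIApplication,
                     performActionFor shortcutItem: UIApplicationShortcutItem,
                     completionHandler: @escaping (Bool) -> Void) {
        // Called when the app is already running (or suspended) and the user picks a quick action.
        completionHandler(handleShortcutItem(shortcutItem))
    }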

Lastly, we need to handle the shortcut and deep link the user into the proper view controller within our app.
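
One possible sketch, assuming a tab bar based app where each quick action maps to a tab; your own navigation structure will differ:

    func handleShortcutItem(_ shortcutItem: UIApplicationShortcutItem) -> Bool {
        guard let identifier = ShortcutIdentifier(fullType: shortcutItem.type),
              let tabBarController = window?.rootViewController as? UITabBarController else {
            return false
        }

        // Deep link to the screen that corresponds to the selected quick action.
        switch identifier {
        case .first:
            tabBarController.selectedIndex = 0
        case .second:
            tabBarController.selectedIndex = 1
        }
        return true
    }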

You can download the completed quick action demo project here.

Scanning QR Code Using AVFoundation Framework

QR (short for Quick Response) codes have gained popularity in the consumer space in recent years as a way to encode the URL of a landing page or marketing information. Unlike the basic barcodes we are familiar with, a QR code contains information in both the horizontal and vertical directions, which lets it store a larger amount of data in both numeric and letter form.

In this tutorial we will build a QR code reader app in Swift. After going through the tutorial you will understand how to use the AVFoundation framework to discover and read QR codes in real time. Any barcode scanning in iOS, including QR code scanning, is based on video capture, which is why the barcode scanning feature is part of the AVFoundation framework. The app works pretty much like a video capturing app, but without the recording feature: when the app is launched, it uses the iPhone’s rear camera to spot a QR code and recognises it automatically.

I have created the user interface of the app in the project template. The label in the UI will be used to display the decoded information of the QR code, and it is associated with the msglbl property of the ViewController class.

As mentioned, we rely on the AVFoundation framework to implement the QR code scanning feature.

1) First, open ViewController.swift and import the framework:

     import AVFoundation

2) Later, we’ll need to implement the AVCaptureMetadataOutputObjectsDelegate protocol.

      class ViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate

3) Instantiate an AVCaptureSession object with its input set to the appropriate AVCaptureDevice for video capturing. Since we are going to capture video data, we call the default(for:) method, passing it the .video media type (AVMediaTypeVideo in older versions of Swift), to get the video capture device. We then instantiate an AVCaptureSession object and add the input of the video capture device. The AVCaptureSession object is used to coordinate the flow of data from the video input device to our output. (A sketch of the full setup follows step 4 below.)

The output of the session is set to an AVCaptureMetadataOutput object. The AVCaptureMetadataOutput class is the core part of QR code reading.

This class in combination with the AVCaptureMetadataOutputObjectsDelegate protocol is used to intercept any metadata found in the input device (the QR code captured by the device’s camera) and translate it to a human-readable format.

4) The metadataObjectTypes property is also quite important, as this is where we tell the app what kind of metadata we are interested in. The AVMetadataObjectTypeQRCode type clearly indicates our purpose. Now that we have set up and configured the AVCaptureMetadataOutput object, we need to display the video captured by the device’s camera on screen. This can be done with an AVCaptureVideoPreviewLayer, which is actually a CALayer; you use this preview layer in conjunction with an AV capture session to display video. The preview layer is added as a sublayer of the current view. Finally, start the video capture by calling the startRunning method of the capture session.
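
Putting steps 3 and 4 together, a minimal sketch of the setup inside ViewController (error handling is reduced to a print for brevity; names other than the msglbl outlet are placeholders):

    // Inside ViewController; call setupCaptureSession() from viewDidLoad().
    var captureSession: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?

    func setupCaptureSession() {
        // Get the back camera and wrap it in a capture input.
        guard let captureDevice = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: captureDevice) else {
            print("Failed to get the camera device")
            return
        }

        let session = AVCaptureSession()
        session.addInput(input)

        // The metadata output intercepts machine-readable codes found in the video stream.
        let metadataOutput = AVCaptureMetadataOutput()
        session.addOutput(metadataOutput)
        metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
        metadataOutput.metadataObjectTypes = [.qr]   // we only care about QR codes

        // Show the camera feed behind the rest of the UI.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.frame = view.layer.bounds
        view.layer.addSublayer(previewLayer)
        view.bringSubviewToFront(msglbl)   // keep the label visible above the preview
        videoPreviewLayer = previewLayer

        session.startRunning()
        captureSession = session
    }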

5) When the AVCaptureMetadataOutput object recognises a QR code, the following delegate method of AVCaptureMetadataOutputObjectsDelegate will be called:

       func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection)

The second parameter (i.e. metadataObjects) of the method is an array object which contains all the metadata objects that have been read.

6) Lastly, we decode the QR code into human-readable information. The decoded information can be accessed through the stringValue property of an AVMetadataMachineReadableCodeObject. Note that iOS requires app developers to obtain the user’s permission before allowing the app to access the camera. To do so, add a key named NSCameraUsageDescription to the Info.plist file: open the file, set the key to Privacy – Camera Usage Description, and set the value to something like “We need to access your camera for scanning QR code”.
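
A sketch of the delegate method from step 5, pulling the decoded string out of the first metadata object and showing it in the label:

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        // If nothing readable is in view, reset the label.
        guard let metadataObject = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              metadataObject.type == .qr,
              let decodedString = metadataObject.stringValue else {
            msglbl.text = "No QR code is detected"
            return
        }

        // Display the human-readable payload of the QR code.
        msglbl.text = decodedString
    }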

7) Now you’re ready to go! Hit the Run button to compile and run the app on a real device. Once launched, tap the scan button and then point the device at the QR code. The app immediately detects the code and decodes the information.

You can download the completed project here.