Common methods in Swift

1. Check for any empty field in an NSDictionary

class func checkForEmptyValueInDictionary(dic: NSDictionary) -> Bool {

    for (keyVal, dataVal) in dic {

        // Assumes the dictionary values are strings.
        if dataVal.length == 0 {

            println("\(keyVal) has an empty value")

            return false
        }
    }

    return true
}

2. Email validation

class func isValidEmail(testStr: String) -> Bool {

    // The local part (before the @) must contain at least one letter.
    let localPart = testStr.componentsSeparatedByString("@")[0]
    if localPart.rangeOfCharacterFromSet(NSCharacterSet.letterCharacterSet()) == nil {
        return false
    }

    let emailRegEx = "[A-Z0-9a-z._%+-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,4}"
    let emailTest = NSPredicate(format: "SELF MATCHES %@", emailRegEx)
    return emailTest.evaluateWithObject(testStr)
}
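A quick usage sketch (the enclosing class name Utility is an assumption; the snippets in this post don't show their enclosing class):

```swift
if Utility.isValidEmail("john.doe@example.com") {
    println("valid email")
} else {
    println("invalid email")
}
```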


3. Alert in Swift

class func commonAlert(title: String, msg: String, curView: UIViewController) {

    let device: UIDevice = UIDevice.currentDevice()
    let systemVersion = device.systemVersion
    let iosVersion: Float = (systemVersion as NSString).floatValue

    if iosVersion >= 8.0 {

        let alert = UIAlertController(title: title, message: msg, preferredStyle: UIAlertControllerStyle.Alert)
        alert.addAction(UIAlertAction(title: "OK", style: UIAlertActionStyle.Default, handler: nil))
        curView.presentViewController(alert, animated: true, completion: nil)

    } else {

        // UIAlertView is used on iOS 7 and earlier.
        let alert = UIAlertView(title: title, message: msg, delegate: nil, cancelButtonTitle: "OK")
        alert.show()
    }
}




4. NSUserDefaults in Swift as common functions

class func saveToUserDefault(value: AnyObject, key: String) {

    NSUserDefaults.standardUserDefaults().setObject(value, forKey: key)
    NSUserDefaults.standardUserDefaults().synchronize()
}

class func userDefaultForKey(key: String) -> String {

    return NSUserDefaults.standardUserDefaults().objectForKey(key) as NSString
}

class func userDefaultForAny(key: String) -> AnyObject! {

    return NSUserDefaults.standardUserDefaults().objectForKey(key)
}

class func userDefaultForArray(key: String) -> Array<AnyObject> {

    return NSUserDefaults.standardUserDefaults().objectForKey(key) as Array
}

class func removeFromUserDefaultForKey(key: String) {

    NSUserDefaults.standardUserDefaults().removeObjectForKey(key)
    NSUserDefaults.standardUserDefaults().synchronize()
}




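These helpers can then be called from anywhere (the class name Utility and the key name are illustrative):

```swift
// Save, read back and remove a value through the helpers above.
Utility.saveToUserDefault("john@example.com", key: "userEmail")
let email = Utility.userDefaultForKey("userEmail")
println("stored email: \(email)")
Utility.removeFromUserDefaultForKey("userEmail")
```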


5. Get screen height and width

let _screenWidth = UIScreen.mainScreen().bounds.size.width

let _screenHeight = UIScreen.mainScreen().bounds.size.height


Use different fonts in a label in Swift

// The original width expression was garbled; half the screen width is assumed here.
let secondLabel = UILabel(frame: CGRectMake(0, 16, _screenWidth / 2, 20))

secondLabel.font = UIFont(name: "Arial", size: 12)

let attrString = NSMutableAttributedString(string: "I agree to the Terms of Service and Privacy Policy.")

NSLog("text length is %d", attrString.length)

// Bold "Terms of Service" and "Privacy Policy"; keep "and" in the regular font.
attrString.addAttribute(NSFontAttributeName, value: UIFont(name: "Helvetica-Bold", size: 14), range: NSMakeRange(15, 16))

attrString.addAttribute(NSFontAttributeName, value: UIFont(name: "Helvetica", size: 12), range: NSMakeRange(31, 4))

attrString.addAttribute(NSFontAttributeName, value: UIFont(name: "Helvetica-Bold", size: 14), range: NSMakeRange(36, 15))

secondLabel.attributedText = attrString



Use a hex color in Swift

Add this extension in your class:

extension UIColor {

    convenience init(red: Int, green: Int, blue: Int) {
        assert(red >= 0 && red <= 255, "Invalid red component")
        assert(green >= 0 && green <= 255, "Invalid green component")
        assert(blue >= 0 && blue <= 255, "Invalid blue component")
        self.init(red: CGFloat(red) / 255.0, green: CGFloat(green) / 255.0, blue: CGFloat(blue) / 255.0, alpha: 1.0)
    }

    convenience init(netHex: Int) {
        self.init(red: (netHex >> 16) & 0xff, green: (netHex >> 8) & 0xff, blue: netHex & 0xff)
    }
}

Use it wherever you need it:

var color = UIColor(red: 0xFF, green: 0xFF, blue: 0xFF)

var color2 = UIColor(netHex: 0xFFFFFF)


Apple has been politely suggesting that we use adaptive layouts since iOS 6, but until now I feel that people have been avoiding the topic, preferring to think mostly about fixed layouts.

With the iPhone 6 it's about to get a lot harder to avoid using adaptive layouts. With four screen sizes (five if you're supporting the iPad), three resolutions and orientations to account for, it just seems easier (and smarter) to start thinking about adaptive layouts from the start of your design process.


A soft touch of Apple HealthKit :: HealthKit Tutorial


HealthKit :: The HealthKit framework provides a structure that apps can use to share health and fitness data. HealthKit is designed to manage data from a wide range of sources, automatically merging the data from all the different sources based on users’ preferences. Apps can also access the raw data for each source and let the app perform its own merging.

Some key features of HealthKit are the following:

  • To use HealthKit in your application, you first have to enable it. To enable HealthKit, perform the following step:

         Project settings —> Capabilities pane —> enable HealthKit

         This requires that you sign in to your Apple developer account, and then acquires the appropriate entitlements for your app.

  • HealthKit data is not saved to iCloud or synced across multiple devices; the data is kept only locally on the user's device. For security, the HealthKit store is encrypted while the device is locked.
  • HealthKit can be used only on iPhone and iPod touch, not on iPad. Developers can check for HealthKit availability using isHealthDataAvailable(), which returns a Boolean value indicating whether HealthKit is available on the device.
  • HKHealthStore is used to manage all types of data in HealthKit. Creating a weight sample looks like this (numberString is a helper that parses user input into a Double):

        if let massNumber = numberString("50") {

            let weightType = HKObjectType.quantityTypeForIdentifier(HKQuantityTypeIdentifierBodyMass)
            let weightValue = HKQuantity(unit: HKUnit(fromString: "kg"), doubleValue: massNumber)
            let metadata = [HKMetadataKeyWasUserEntered: true]
            let now = NSDate()
            let sample = HKQuantitySample(type: weightType, quantity: weightValue, startDate: now, endDate: now, metadata: metadata)
        }
  • All access to the data is performed through an HKHealthStore object, and it is this object that you request permissions from. Requesting access to data from the store is done as follows:
private func requestAuthorisationForHealthStore() {

  // The original listing omitted the concrete types; body mass is used here
  // to match the rest of the example.
  let dataTypesToWrite = [
    HKObjectType.quantityTypeForIdentifier(HKQuantityTypeIdentifierBodyMass)
  ]

  let dataTypesToRead = [
    HKObjectType.quantityTypeForIdentifier(HKQuantityTypeIdentifierBodyMass)
  ]

  self.healthStore?.requestAuthorizationToShareTypes(NSSet(array: dataTypesToWrite),
    readTypes: NSSet(array: dataTypesToRead), completion: {
      (success, error) in
      if success {
        println("User completed authorisation request.")
      } else {
        println("The user cancelled the authorisation request. \(error)")
      }
  })
}

  • Samples are saved in the store using the HKHealthStore object. Sample code for this is the following:

        func saveSampleToHealthStore(sample: HKObject) {

            println("Saving weight")

            self.healthStore?.saveObject(sample, withCompletion: {
                (success, error) in
                if success {
                    println("Weight saved successfully")
                } else {
                    println("Error: \(error)")
                }
            })
        }


  • Data can be retrieved from the store with an HKQuery. Sample code to retrieve the weight samples of the last two months:

        func performQueryForWeightSamples() {

            let endDate = NSDate()
            let startDate = NSCalendar.currentCalendar().dateByAddingUnit(.CalendarUnitMonth,
                value: -2, toDate: endDate, options: nil)

            let weightSampleType = HKSampleType.quantityTypeForIdentifier(HKQuantityTypeIdentifierBodyMass)
            let predicate = HKQuery.predicateForSamplesWithStartDate(startDate,
                endDate: endDate, options: .None)

            let query = HKSampleQuery(sampleType: weightSampleType, predicate: predicate,
                limit: 0, sortDescriptors: nil, resultsHandler: {
                    (query, results, error) in
                    if results == nil {
                        println("There was an error running the query: \(error)")
                        return
                    }
                    var data = results as [HKQuantitySample]
            })

            // The query does nothing until it is executed against the store.
            self.healthStore?.executeQuery(query)
        }


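Before running any of the saves or queries above, it is worth guarding on availability, as noted earlier. A minimal sketch (assuming a healthStore property on the view controller):

```swift
// HealthKit is not available on iPad; check before creating the store.
if HKHealthStore.isHealthDataAvailable() {
    self.healthStore = HKHealthStore()
} else {
    println("HealthKit is not available on this device.")
}
```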


HealthKit supports over 50 types of workout activities, including:

  • Archery
  • Yoga
  • Swimming
  • StairClimbing
  • SnowSports
  • SkatingSports
  • Hockey
  • Fishing
  • Dance
  • CrossTraining

The types of values that can be stored in the store include the following:

  1. Body mass
  2. Height
  3. Step count
  4. Blood glucose
  5. Blood alcohol content
  6. Dietary energy consumed
  7. Body temperature
  8. Inhaler usage

These are known as HKTypeIdentifiers.
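In code, these map to HK*TypeIdentifier constants defined by the HealthKit framework; a few examples:

```swift
let mass = HKQuantityTypeIdentifierBodyMass
let height = HKQuantityTypeIdentifierHeight
let steps = HKQuantityTypeIdentifierStepCount
let glucose = HKQuantityTypeIdentifierBloodGlucose
let bodyTemp = HKQuantityTypeIdentifierBodyTemperature
```

Each identifier is passed to HKObjectType.quantityTypeForIdentifier() to obtain the type object used in samples and queries.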


Backward data passing in Xcode

Step 1. Define a protocol in FirstViewController.h:

#import <UIKit/UIKit.h>

@protocol ViewBProtocol

- (void)setData:(NSString *)data;

@end

@interface MSYViewController : UIViewController <ViewBProtocol>

@end


Step 2. In FirstViewController.m:

- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {

    UIViewController *controller = [segue destinationViewController];

    if ([controller isKindOfClass:[MSYQuizView class]]) {

        MSYQuizView *viewCController = (MSYQuizView *)controller;
        viewCController.delegate = self;
    }
}

// where MSYQuizView is the second view controller.

// Implement the protocol method.

- (void)setData:(NSString *)data {

    NSLog(@"data is %@", data);
}

Step 3. In the second view controller, MSYQuizView.h:

#import <UIKit/UIKit.h>

#import "MSYViewController.h"

@interface MSYQuizView : UIViewController

@property (nonatomic, weak) id<ViewBProtocol> delegate;

@end

Step 4. In MSYQuizView.m (the second view controller):

- (void)sendData:(NSString *)data {

    NSLog(@"sending %@", data);

    [self.delegate setData:data];
}

// Call it anywhere you want, like this:

- (void)viewDidDisappear:(BOOL)animated {

    [super viewDidDisappear:animated];

    [self sendData:@"now success to complete"];
}


AirDrop in iOS 7

AirDrop Overview

Before we step into the implementation, let's have a quick look at AirDrop. It's very simple to use. Simply bring up Control Center and tap AirDrop to enable it. You can select either "Contacts Only" or "Everyone", depending on whom you want to share data with. If you choose the Contacts Only option, your device will only be discovered by people listed in your contacts. Obviously, your device can be discovered by anyone with the Everyone option.

Airdrop Overview

AirDrop uses Bluetooth to scan for nearby devices. When a connection is established via Bluetooth, it creates an ad-hoc Wi-Fi network to link the two devices together, allowing for faster data transmission. This doesn't mean you need to connect the devices to a Wi-Fi network in order to use AirDrop; your Wi-Fi simply needs to be on for the data transfer.

Say you want to share a photo in the Photos app from one iPhone to another. Assuming you’ve enabled AirDrop on both devices, to share the photos with another device, tap the Share button (the one with an arrow pointing up) at the lower-left of the screen.

Airdrop Overview Receiving Side

In the AirDrop area, you should see the name of the devices that are eligible for sharing. AirDrop is not available when the screen is turned off. So make sure the device on the receiving side is switched on. You can then select the device to share the photo. On the other device, you’ll see a preview of the photo and a confirmation request. The recipient can accept or decline to receive the image. If you choose the accept option, the photo is then transferred and automatically saved in the camera roll.

AirDrop doesn’t just work with the Photos app. You can also find the share option in most of the built-in apps such as Contacts, iTunes, App Store, Safari, to name a few. If you’re new to AirDrop, you should now have a better idea.

Let's move on and see how we can incorporate the AirDrop feature in your app to share various types of data.

A Quick Look at UIActivityViewController

You may think it'll take a lot of effort to implement the AirDrop feature. On the contrary, you just need a few lines of code to add AirDrop support. The UIActivityViewController class available in the iOS 7 SDK makes it super easy to integrate the feature.

The UIActivityViewController class is a standard view controller that provides several standard services, such as copying items to the clipboard, sharing content to social media sites, sending items via Messages, etc. In iOS 7 SDK, the class comes with the AirDrop feature built-in.

Say you have an array of objects to share using AirDrop. All you need to do is instantiate a UIActivityViewController with the array of objects and present it on screen:

UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:objectsToShare applicationActivities:nil];
[self presentViewController:controller animated:YES completion:nil];

With just two lines of code, you can bring up the activity view with AirDrop option. Whenever there is a nearby device detected, the activity controller automatically shows the device and handles the data transfer if you choose to.

UIActivityViewController AirDrop

Optionally, you can exclude certain types of activities. Say, you can just display the AirDrop activity by excluding all other activities. Use the following code:

    UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:objectsToShare applicationActivities:nil];
    NSArray *excludedActivities = @[UIActivityTypePostToTwitter, UIActivityTypePostToFacebook,
                                    UIActivityTypeMessage, UIActivityTypeMail,
                                    UIActivityTypePrint, UIActivityTypeCopyToPasteboard,
                                    UIActivityTypeAssignToContact, UIActivityTypeSaveToCameraRoll,
                                    UIActivityTypeAddToReadingList, UIActivityTypePostToFlickr,
                                    UIActivityTypePostToVimeo, UIActivityTypePostToTencentWeibo];
    controller.excludedActivityTypes = excludedActivities;
    [self presentViewController:controller animated:YES completion:nil];

The activity view controller now only shows the AirDrop option:

UIActivityViewController AirDrop Only

You can use UIActivityViewController to share different types of data including NSString, UIImage and NSURL. Not only can you use NSURL to share a link; it also allows developers to transfer any type of file by using a file URL.

On the receiving side, when the other device receives the data, it'll automatically open an app based on the data type. Say, if a UIImage is transferred, the received image will be displayed in the Photos app. When you transfer a PDF file, the other device will open it in Safari. If you just share an NSString object, the data will be presented in the Notes app.

A Glance at the AirDrop Demo App

To give you a better idea about UIActivityViewController and AirDrop, we'll build an AirDrop demo app. The app is very simple. When it is first launched, you'll see a table view listing a few files including an image file, a PDF file and a text file. You can tap a file and view the content. In the content view, there is an action button in the top-right corner of the screen. Tapping it will bring up the AirDrop option and you can share the image or document with a nearby device.

AirDrop Demo App Workflow

You're encouraged to build the demo app from scratch. But to save time, you can download this project template to start with. When you open the Xcode project, you should find the following Storyboard:

AirDrop Demo Storyboard

I have already implemented the ListTableViewController and DocumentViewController for you. If you compile and run the app, you'll be presented with a list of files. When you tap any of the files, the image or document content will be displayed. But the share button is not yet implemented, and that is what we're going to talk about next.

Adding AirDrop Feature

In the project template, the ListTableViewController is used to display the list of files in a table view, while the DocumentViewController presents the document content via a web view. The action button in the document view is associated with the share: method of the DocumentViewController class. Edit the method with the following code:

- (IBAction)share:(id)sender {
    NSURL *url = [self fileToURL:self.documentName];
    NSArray *objectsToShare = @[url];

    UIActivityViewController *controller = [[UIActivityViewController alloc] initWithActivityItems:objectsToShare applicationActivities:nil];
    // Exclude all activities except AirDrop.
    NSArray *excludedActivities = @[UIActivityTypePostToTwitter, UIActivityTypePostToFacebook,
                                    UIActivityTypeMessage, UIActivityTypeMail,
                                    UIActivityTypePrint, UIActivityTypeCopyToPasteboard,
                                    UIActivityTypeAssignToContact, UIActivityTypeSaveToCameraRoll,
                                    UIActivityTypeAddToReadingList, UIActivityTypePostToFlickr,
                                    UIActivityTypePostToVimeo, UIActivityTypePostToTencentWeibo];
    controller.excludedActivityTypes = excludedActivities;
    // Present the controller
    [self presentViewController:controller animated:YES completion:nil];
}

If you've been following along, the above code should be very familiar, as we discussed it at the very beginning. It simply creates a UIActivityViewController, excludes all activities except AirDrop, and presents the controller as a modal view. The tricky part is how you define the objects to share. Here we turn the file to share into an NSURL object and pass the file URL as an array to AirDrop.

The first two lines of code are responsible for the file URL conversion. The documentName property stores the current file (e.g. ios-game-kit-sample.pdf) displaying in document view. We simply call up the fileToURL: method with the document name and it returns the corresponding file URL. The fileToURL: method is bundled in the project template and here is the code:

- (NSURL *)fileToURL:(NSString *)filename {
    NSArray *fileComponents = [filename componentsSeparatedByString:@"."];
    NSString *filePath = [[NSBundle mainBundle] pathForResource:[fileComponents objectAtIndex:0] ofType:[fileComponents objectAtIndex:1]];

    return [NSURL fileURLWithPath:filePath];
}
The code is very straightforward. For example, ios-game-kit-sample.pdf will be transformed to a file URL such as file:///Users/simon/Library/Application%20Support/iPhone%20Simulator/7.0.3/Applications/A5321493-318A-4A3B-8B37-E56B8B4405FC/ The file URL varies depending on the device you're running, but it always begins with the "file://" scheme. With the file URL object, we create the corresponding array and pass it to UIActivityViewController for AirDrop sharing.

Build and Run the AirDrop Demo

You’re done. That’s what you need to implement AirDrop sharing. Compile and run the app on a real iPhone. Yes, you need a real device to test AirDrop sharing. The sharing feature won’t work on the Simulator.

AirDrop Demo App Share PDF

Uniform Type Identifiers (UTIs)

When you share an image to another iOS device, the receiving side automatically opens the Photos app and loads the image. If you transfer a PDF file, the receiving device may prompt you to pick an app for opening the file, or open it directly in iBooks. How does iOS know which app to use for each type of data?

UTIs (short for Uniform Type Identifiers) are Apple's answer to identifying data handled within the system. In brief, a uniform type identifier is a unique identifier for a particular type of data or file. For instance, com.adobe.pdf represents a PDF document and public.png represents a PNG image. You can find the full list of registered UTIs here. An application that is capable of opening a specific type of file registers the corresponding UTI with iOS. So whenever that type of file is opened, iOS hands the file off to a specific app.
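As a rough sketch, an app declares the UTIs it can open in its Info.plist under the CFBundleDocumentTypes key; the entry below registers the app as a handler for PDF files (the type name and handler rank shown are illustrative choices):

```xml
<key>CFBundleDocumentTypes</key>
<array>
    <dict>
        <key>CFBundleTypeName</key>
        <string>PDF Document</string>
        <key>LSHandlerRank</key>
        <string>Alternate</string>
        <key>LSItemContentTypes</key>
        <array>
            <string>com.adobe.pdf</string>
        </array>
    </dict>
</array>
```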

The system allows multiple apps to register the same UTI. In this case, iOS will prompt the user with a list of capable apps for opening the file. For example, when you share a PDF document, you may see the following screen on the receiving device:


Wrap Up

AirDrop is a very cool feature introduced in iOS 7. It offers a great way to share data between devices. Best of all, the built-in UIActivityViewController has made it easy for developers to add AirDrop support to their apps. As you can see from the demo app, it takes just a few lines of code to implement the feature. I highly recommend putting this sharing feature in your app.

For your complete reference, you can download the full source code of the Xcode project from here.

As always, leave us comment and share your thought about the tutorial. We love to hear your feedback.

Custom video camera with AVFoundation

AVFoundation is a very cool framework that allows you to collect multimedia data generated by different input sources (camera, microphone, etc.) and redirect them to any output destination (screen, speakers, etc.). You can create custom playback and capture solutions for audio, video and still images. The advantage of using this framework over off-the-shelf solutions such as MPMoviePlayerController or UIImagePickerController is that you get access to the raw data of the camera. In this way, you can apply effects in real time to the input signals for different purposes.

I have prepared for you a small app to show you how to use this framework and create a very cool video camera.


AVFoundation is based on the concept of the session. A session is used to control the flow of the data from an input to an output device. The creation of a session is really straightforward:

AVCaptureSession *session = [[AVCaptureSession alloc] init];

The session allows you to define the audio and video recording quality, using the sessionPreset property of the AVCaptureSession class. For this example, it's fine to go for low-quality data (so we save some battery cycles):

[session setSessionPreset:AVCaptureSessionPresetLow];

Capture Device

After the capture session has been created, you need to define the capture device you want to use. It can be the camera or the microphone. In this case, I am going to use the AVMediaTypeVideo type, which supports videos and images:

AVCaptureDevice *inputDevice = 
[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

Capture Device Input

Next, you need to define the input of the capture device and add it to the session. Here you go:

AVCaptureDeviceInput *deviceInput = 
[AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
if ( [session canAddInput:deviceInput] )
    [session addInput:deviceInput];

You check if you can add the device input to the session and if you can, you add it.


Before defining the device output, I want to show you how to preview the camera buffer. This will be the viewfinder of your camera, i.e. the preview of what the input device is seeing.

We can quickly render the raw data collected by the camera on the screen using the AVCaptureVideoPreviewLayer. We can create this preview layer using the session we defined above and then add it to our main view layer:

AVCaptureVideoPreviewLayer *previewLayer = 
[[AVCaptureVideoPreviewLayer alloc] initWithSession:session]; 
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
[previewLayer setFrame:
CGRectMake(-70, 0, rootLayer.bounds.size.height, rootLayer.bounds.size.height)];
[rootLayer insertSublayer:previewLayer atIndex:0];

You don’t need to do any additional work. You can now display the camera signal on your screen.

If you instead want to do some more cool stuff, for example, if you want to process the camera signal to create nice video effects with Core Image or the Accelerate framework (have a look at this post), you need to collect the raw data generated by the camera, process them, and, if you like, display them on the screen.

Go baby, go!!!

We are ready to go. The last thing you need to do is to start the session:

[session startRunning];

Cool stuff

Since the AVCaptureVideoPreviewLayer is a layer, you can obviously add animations to it. I am attaching a very simple Xcode project demonstrating the previous concepts: it creates a custom video camera with the preview rotating in 3D space.

Real-time processing

If you want to do some image processing with the raw data captured by the camera and display the result on the screen, you need to collect those data, process them and render them on the screen without using the AVCaptureVideoPreviewLayer. Depending on what you want to achieve, you have two main strategies:

  1. Either you capture a still picture as soon as you need one; or
  2. You continuously capture the video buffer.

Now, the first approach is the simplest one: whenever you need to know what the camera is looking at, you just shoot a picture. Instead, if you want to process the video buffer, that's more tricky, especially if your image processing algorithm is slower than the camera framerate output. Here, you need to evaluate which solution is more suitable for your case. Take into account that, depending on the device, you can get different image resolutions. For example, the iPhone 4s can provide images up to 8 megapixels. Now, that's a lot of data to process in real time. So, if you are doing real-time image processing, you may need to accept some lower-quality images. But all these considerations are a topic for a next post.
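As a hedged sketch of the second strategy (in Swift, for consistency with the first half of this post), frames can be collected with an AVCaptureVideoDataOutput attached to the same session; a delegate then receives each frame for processing. The queue label is illustrative:

```swift
// Strategy 2: continuously capture the video buffer.
// self must adopt AVCaptureVideoDataOutputSampleBufferDelegate and implement
// captureOutput(_:didOutputSampleBuffer:fromConnection:) to process each frame.
let videoOutput = AVCaptureVideoDataOutput()
// Drop frames when processing is slower than the camera framerate.
videoOutput.alwaysDiscardsLateVideoFrames = true
let frameQueue = dispatch_queue_create("com.example.frames", DISPATCH_QUEUE_SERIAL)
videoOutput.setSampleBufferDelegate(self, queue: frameQueue)
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}
```

Keeping the delegate callback fast (or discarding late frames, as above) is what keeps the preview responsive when the processing cannot keep up.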