Augmented Reality iOS Tutorial: Location Based

I think the altitude of the location shouldn’t be important; at least my telescope doesn’t ask me for the altitude, just lat and lon. Astronomical objects have an “altitude” in degrees above the horizon, and this doesn’t change with the altitude of the user’s location.

Edit: And let me know if you have something to show. I’m always interested in astronomical apps.

In this case you can try to measure how many pixels the marker was moved and calculate the new lat and lon from that, but for this calculation you must also consider the distance of the object: moving a marker by 20 pixels means a much smaller real-world distance for an object that is 1m away than for one that is 20m away. A sketch of the coordinate-shift part is below.
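A rough sketch of that coordinate shift, assuming you have already converted the dragged pixels into an east/north offset in meters (the function and names here are hypothetical, not from the tutorial):

import CoreLocation

// Shift a coordinate by an offset given in meters. The offset would come from
// the dragged pixels multiplied by a distance-dependent meters-per-pixel
// factor (see the field-of-view explanation later in this thread).
func shifted(_ coord: CLLocationCoordinate2D,
             eastMeters: Double,
             northMeters: Double) -> CLLocationCoordinate2D {
    let earthRadius = 6_371_000.0
    let dLat = (northMeters / earthRadius) * 180 / .pi
    let dLon = (eastMeters / (earthRadius * cos(coord.latitude * .pi / 180))) * 180 / .pi
    return CLLocationCoordinate2D(latitude: coord.latitude + dLat,
                                  longitude: coord.longitude + dLon)
}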

Were you able to solve your problem? A good starting point would be to add a breakpoint in ViewController.swift inside

func ar(_ arViewController: ARViewController, viewForAnnotation: ARAnnotation) -> ARAnnotationView

to check whether this method is called and whether a view is returned.
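For example, a sketch of that delegate method with a debug print added (this assumes AnnotationView is the tutorial’s custom ARAnnotationView subclass; adjust to your own code):

func ar(_ arViewController: ARViewController, viewForAnnotation: ARAnnotation) -> ARAnnotationView {
    // Set a breakpoint or print here to confirm the method fires once per annotation.
    let annotationView = AnnotationView()
    annotationView.annotation = viewForAnnotation
    annotationView.frame = CGRect(x: 0, y: 0, width: 150, height: 50)
    print("viewForAnnotation called, returning view with frame \(annotationView.frame)")
    return annotationView
}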

Thanks for the reply, but I couldn’t wrap my head around what you said at the end. If you don’t mind, could I ask you to explain?

Did that, thanks, but now it breaks here:

ARViewController.swift:891:53: Use of instance member ‘createCaptureSession’ on type ‘ARViewController’; did you mean to use a value of type ‘ARViewController’ instead?

fileprivate func loadCamera()
{
    self.cameraLayer?.removeFromSuperlayer()
    self.cameraLayer = nil
    
    //===== Video device/video input
    let captureSessionResult = ARViewController.createCaptureSession()
    guard captureSessionResult.error == nil, let session = captureSessionResult.session else
    {
        print("HDAugmentedReality: Cannot create capture session, use createCaptureSession method to check if device is capable for augmented reality.")
        return
    }

Change the method declaration to

class func createCaptureSession() -> (session: AVCaptureSession?, error: NSError?) {
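For context, a minimal sketch of what such a class method can look like inside ARViewController, using plain AVFoundation (the real HDAugmentedReality implementation performs additional capability checks):

import AVFoundation

class func createCaptureSession() -> (session: AVCaptureSession?, error: NSError?) {
    // A class method needs no instance, which matches the call site
    // ARViewController.createCaptureSession() inside loadCamera().
    guard let videoDevice = AVCaptureDevice.default(for: .video) else {
        let error = NSError(domain: "HDAugmentedReality", code: 1,
                            userInfo: [NSLocalizedDescriptionKey: "No video device available."])
        return (nil, error)
    }
    do {
        let videoInput = try AVCaptureDeviceInput(device: videoDevice)
        let session = AVCaptureSession()
        if session.canAddInput(videoInput) {
            session.addInput(videoInput)
        }
        return (session, nil)
    } catch let inputError as NSError {
        return (nil, inputError)
    }
}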

Thanks for your reply.
The method is called multiple times, as there are several POIs to show.
But I overlooked a message:
[LogMessageLogging] 6.1 Unable to retrieve CarrierName. CTError: domain-2, code-5, errStr:((os/kern) failure)

I googled and found that this could be a MapKit bug/problem since the update to iOS 10.1x, Xcode 8.x, and Swift 3.

So, I think I have to wait for a bug-fix update :frowning:

Sure. A camera has a field of view that looks like a pyramid, with the top at the lens and the bottom at the horizon. All parts of this pyramid, the small ones near the top and the big ones at the bottom, have the same size on the display. That’s why moving something by a pixel doesn’t always mean the same distance.

Here are two images of my door; the first was taken at a distance of about 2m and the second at 5m. The red line is about 250 px in both images. As you can see, in the first image the line corresponds to about 10cm, and in the second it is about 30cm.
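The same effect in numbers, as a sketch with a simple pinhole-camera model (the field of view and image width below are assumed values, not measured from these photos):

import Foundation

// Meters covered by one pixel at a given distance, for a camera with the
// given horizontal field of view and image width.
func metersPerPixel(distance: Double,
                    horizontalFOVDegrees: Double = 60,
                    imageWidthPixels: Double = 3000) -> Double {
    let halfFOV = horizontalFOVDegrees / 2 * .pi / 180
    let visibleWidth = 2 * distance * tan(halfFOV) // meters across the whole image
    return visibleWidth / imageWidthPixels
}

// The same 250 px line covers very different real-world distances:
let near = 250 * metersPerPixel(distance: 2) // ≈ 0.19 m
let far  = 250 * metersPerPixel(distance: 5) // ≈ 0.48 m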


But I think you will have another problem: you want to move an object on a two-dimensional screen and map that back into the 3D world. It will be hard to change lat and lon at the same time.

I don’t think that this is the problem. The AR part is independent of MapKit.

Inside ARViewController.swift is a method

fileprivate func positionAnnotationViews()

Try printing the calculated frames of the AnnotationViews; I think something is wrong there: either the width/height or the x/y position is not on the screen. Maybe it’s also something with the z value. You can also add a breakpoint there and inspect the view with Quick Look.

And can you check the maxVisibleAnnotations property of ARViewController and make sure that it is not 0?
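For example, a debug sketch you could drop into positionAnnotationViews() (this assumes the library keeps its views in an annotationViews array; names may differ in your version):

// Inside ARViewController.positionAnnotationViews(), after the frames are set:
let screenBounds = UIScreen.main.bounds
for annotationView in self.annotationViews {
    let onScreen = screenBounds.intersects(annotationView.frame)
    print("frame: \(annotationView.frame), intersects screen: \(onScreen)")
}

// And when configuring the controller, e.g. in ViewController.swift:
// arViewController.maxVisibleAnnotations = 30 // must be greater than 0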

The x and y values from positionAnnotationViews() are very high:
2017-01-31 11:44:03.343332 Places[1022:192238] x: 3586.61134674879, y: 369.2
2017-01-31 11:44:03.343703 Places[1022:192238] x: 4007.66178695483, y: 369.2
2017-01-31 11:44:03.344009 Places[1022:192238] x: 1776.62277009549, y: 369.2
2017-01-31 11:44:03.345160 Places[1022:192238] x: 2127.61554741798, y: 369.2
2017-01-31 11:44:03.345569 Places[1022:192238] x: 1219.30886436162, y: 369.2
2017-01-31 11:44:03.345679 Places[1022:192238] x: 3482.13300368835, y: 315.2
2017-01-31 11:44:03.345774 Places[1022:192238] x: 2477.29868783508, y: 369.2
They are outside my screen.
I tried dividing x and y by 100 and got one POI stuck at the x value, moving along the y axis.
Strange, huh?

Thanks for that. That fixed it.

Oops, nope. It compiles with no errors, but crashes spectacularly now (iPhone SE):

2017-01-31 15:18:58.298563 Places[1111:462902] [LogMessageLogging] 6.1 Unable to retrieve CarrierName. CTError: domain-2, code-5, errStr:((os/kern) failure)
Load pois
fatal error: unexpectedly found nil while unwrapping an Optional value
2017-01-31 15:34:38.763571 Places[1111:462845] fatal error: unexpectedly found nil while unwrapping an Optional value

Thank you for explaining the concept to me, and thank you for your time.

Hi. Can you please reinstate the original Objective-C version of this tutorial? I haven’t managed to get Ray to respond.

I applaud you for updating the tutorials, but wholesale changing it from one language to another and then removing the original really hasn’t helped. Surely just posting a link that doesn’t redirect would be an easy fix? Or provide the download for the finished Obj-C source in the current tutorial?

Thanks.

Bro, here is the source code of the Obj-C version of this tutorial: link. No thanks needed. I don’t know how it works, but if you have trouble with it, send me a message at morphioofcordis[at]gmail.com

1.) Thank you kindly for the updated tutorial; I look forward to working through it. Do you know how precise the distance can be? My hope is that it can be accurate down to at least 1-2 meters or several feet.

2.) Does anyone know of an alternative free solution for this same type of app on Android? I remember researching this some time ago and came up mostly empty.

3.) For those asking for the original Objective-C tutorial link, you can find it here.

Hi,
Nice tutorial, but I want this in Objective-C. The old tutorial’s output is not the same as the Swift tutorial’s: the POI placement is properly aligned in the Swift tutorial, but in Objective-C the POIs overlap each other. Please provide an updated Objective-C tutorial; it would be very helpful for me.

Thanks in advance

Hello, thanks for the awesome tutorial. I have integrated this successfully and it works fine. I want to add a feature to my app: when the user taps an annotation view on the camera view, I want to display the path to it from the user’s current location. I thought about doing this by loading an arrow image on the view controller and changing its orientation as the direction changes while the user walks. Is this possible with this library? Can you guide me on how to start adding this feature to my app?

Hi bro,

The Objective-C project’s output is not the same as the Swift tutorial’s: the location overlays in the camera layer overlap each other, while the Swift tutorial’s output is very effective and good. Can anyone please provide the updated tutorial in Objective-C?
Thanks for the Swift tutorial; it’s the best location-based augmented reality tutorial.

Thanks in advance

Hi nex,

if you use kCLLocationAccuracyBestForNavigation for the desired accuracy it should be possible to get that accuracy, but it depends on the GPS signal, and I guess you won’t get it often. Just try it and check the horizontalAccuracy property of CLLocation every time you receive an updated location. This tells you how accurate the position is: a value of 4 means that the real position is inside a circle with a radius of 4 meters.
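A minimal sketch of that check, assuming the location manager is authorized somewhere else in the app:

import CoreLocation

class LocationAccuracyChecker: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        // horizontalAccuracy is a radius in meters: 4 means the real position
        // is inside a 4 m circle. A negative value means the fix is invalid.
        print("horizontalAccuracy: \(location.horizontalAccuracy) m")
    }
}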

@rdias I haven’t used navigation features in my apps, but this could be a good starting point.
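One hypothetical approach (not from the tutorial): compute the bearing from the user to the target and rotate the arrow whenever the compass heading updates:

import CoreLocation
import UIKit

// Bearing from one coordinate to another, in degrees clockwise from north.
func bearing(from: CLLocationCoordinate2D, to: CLLocationCoordinate2D) -> Double {
    let lat1 = from.latitude * .pi / 180
    let lat2 = to.latitude * .pi / 180
    let dLon = (to.longitude - from.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    return atan2(y, x) * 180 / .pi
}

// In locationManager(_:didUpdateHeading:), rotate a hypothetical arrowImageView:
//   let angle = bearing(from: userCoordinate, to: targetCoordinate) - newHeading.trueHeading
//   arrowImageView.transform = CGAffineTransform(rotationAngle: CGFloat(angle * .pi / 180))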

@natarajs If you think that the Swift output fits your needs better than the Obj-C version, you can integrate the lib into your Obj-C code.