I think the altitude of the location shouldn't be important; at least my telescope doesn't ask me for the altitude, just lat and lon. Astronomical objects have an "altitude" in degrees above the horizon, and this doesn't change with the altitude of the user's location.
Edit: And let me know if you have something to show. I'm always interested in astronomical apps.
In this case you can try to measure how many pixels the marker was moved and calculate the new lat and lon, but for this calculation you must also consider the distance of the object. Moving a marker that is 1 m away by 20 pixels corresponds to a much smaller real-world distance than moving a marker that is 20 m away by the same 20 pixels.
ARViewController.swift:891:53: Use of instance member 'createCaptureSession' on type 'ARViewController'; did you mean to use a value of type 'ARViewController' instead?
fileprivate func loadCamera()
{
self.cameraLayer?.removeFromSuperlayer()
self.cameraLayer = nil
//===== Video device/video input
let captureSessionResult = ARViewController.createCaptureSession() // <- error on this line
guard captureSessionResult.error == nil, let session = captureSessionResult.session else
{
print("HDAugmentedReality: Cannot create capture session, use createCaptureSession method to check if device is capable for augmented reality.")
return
}
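The compiler error quoted above says `createCaptureSession` is an instance member in the library version being compiled, so it cannot be called on the `ARViewController` type. A minimal sketch of the likely fix, assuming the method's return type is unchanged (the exact signature may differ between HDAugmentedReality versions):

```swift
// Inside ARViewController.loadCamera(): call the method on self instead of
// on the type, since the compiler reports it as an instance member.
let captureSessionResult = self.createCaptureSession()
guard captureSessionResult.error == nil, let session = captureSessionResult.session else
{
    print("HDAugmentedReality: Cannot create capture session.")
    return
}
```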
Thanks for your reply.
The method is called multiple times as there are more POIs to show.
But…
I overlooked a message:
[LogMessageLogging] 6.1 Unable to retrieve CarrierName. CTError: domain-2, code-5, errStr:((os/kern) failure)
I googled and found that using MapKit could be a bug/problem since the update to iOS 10.1x, Xcode 8.x and Swift 3.
So, I think I have to wait for a bug-fix update…
Sure, a camera has a field of view shaped like a pyramid, with the apex at the lens and the base at the horizon. All cross-sections of this pyramid, the small ones near the apex and the big ones near the base, have the same size on the display. That's why moving something by one pixel doesn't always mean the same real-world distance.
Here are two images of my door; the first was taken at a distance of about 2 m and the second at about 5 m. The red line is about 250 px in both images. As you can see, in the first image the line corresponds to about 10 cm, and in the second to about 30 cm.
But I think you will have another problem. You want to move an object on a 2-dimensional screen and reflect this into the 3D world. It would be hard to change lat and lon at the same time.
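The scaling described above is linear in distance, so it can be sketched with a pinhole-camera formula. This is only an illustration; the field of view and image width below are assumed example values, not measured from the actual door photos:

```swift
import Foundation

// Real-world length covered by a span of pixels at a given distance,
// using the pinhole-camera model: the visible width across the frame
// at distance d is 2 * d * tan(fov / 2).
func realLength(pixels: Double, imageWidthPixels: Double,
                distance: Double, horizontalFOVDegrees: Double) -> Double {
    let halfFOV = horizontalFOVDegrees * .pi / 180 / 2
    let visibleWidth = 2 * distance * tan(halfFOV)   // meters across the whole frame
    return visibleWidth * pixels / imageWidthPixels
}

// Assumed values: 60-degree horizontal FOV, 4000 px wide photo.
let at2m = realLength(pixels: 250, imageWidthPixels: 4000,
                      distance: 2, horizontalFOVDegrees: 60)
let at5m = realLength(pixels: 250, imageWidthPixels: 4000,
                      distance: 5, horizontalFOVDegrees: 60)
// at5m / at2m == 2.5: the same 250 px covers 2.5x the real length at 5 m,
// matching the door example (roughly 10 cm vs. roughly 30 cm).
```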
I don't think that this is the problem. The AR part is independent from MapKit.
Inside ARViewController.swift is a method
fileprivate func positionAnnotationViews()
Try to print the calculated frames of the AnnotationView; I think that there is something wrong: the width/height or the x/y position is not on the screen. Maybe it is also something with the z value. You can also add a breakpoint there and check if you can see the view with Quick Look.
And can you check the maxVisibleAnnotations property of ARViewController and make sure that it is not 0?
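A minimal debug sketch for the check above, assuming positionAnnotationViews() iterates an `annotationViews` array as in HDAugmentedReality (the property name may differ in your version):

```swift
// Hypothetical snippet to drop inside positionAnnotationViews() in
// ARViewController.swift: log each view's computed frame so that
// off-screen values stand out immediately.
for annotationView in self.annotationViews {
    let frame = annotationView.frame
    // A view is visible only if its frame intersects the screen bounds.
    let onScreen = UIScreen.main.bounds.intersects(frame)
    print("frame: \(frame), on screen: \(onScreen)")
}
print("maxVisibleAnnotations: \(self.maxVisibleAnnotations)")
```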
The x and y values in positionAnnotationViews() are very high:
2017-01-31 11:44:03.343332 Places[1022:192238] x: 3586.61134674879, y: 369.2
2017-01-31 11:44:03.343703 Places[1022:192238] x: 4007.66178695483, y: 369.2
2017-01-31 11:44:03.344009 Places[1022:192238] x: 1776.62277009549, y: 369.2
2017-01-31 11:44:03.345160 Places[1022:192238] x: 2127.61554741798, y: 369.2
2017-01-31 11:44:03.345569 Places[1022:192238] x: 1219.30886436162, y: 369.2
2017-01-31 11:44:03.345679 Places[1022:192238] x: 3482.13300368835, y: 315.2
2017-01-31 11:44:03.345774 Places[1022:192238] x: 2477.29868783508, y: 369.2
and outside my screen.
I tried dividing x and y by 100 and got one POI stuck at the x value, moving along the y axis.
Strange, huh?
Oops, nope… compiles with no errors, but crashes spectacularly now… iPhone SE
2017-01-31 15:18:58.298563 Places[1111:462902] [LogMessageLogging] 6.1 Unable to retrieve CarrierName. CTError: domain-2, code-5, errStr:((os/kern) failure)
Load pois
fatal error: unexpectedly found nil while unwrapping an Optional value
2017-01-31 15:34:38.763571 Places[1111:462845] fatal error: unexpectedly found nil while unwrapping an Optional value
Hi. Can you please reinstate the original Objective-C version of this tutorial? I haven't managed to get Ray to respond.
I applaud you for updating the tutorials, but wholesale changing one from one language to another and then removing the original really hasn't helped. Surely just pasting a link that doesn't redirect would be an easy fix? Or provide the download for the finished Obj-C source in the current tutorial?
Bro, here is the source code of the Obj-C version of this tutorial. No thanks needed. link I don't know how it works, but if you have trouble with it, send me a message at morphioofcordis[at]gmail.com
1.) Thank you kindly for the updated tutorial; I look forward to following through it. Do you know how precise the distance can be? My hope is that it can be accurate down to at least 1-2 meters or several feet.
2.) Does anyone know of an alternative free solution for this same type of app on Android? I remember researching this some time ago and came up mostly empty.
3.) For those asking for the original Objective-C tutorial link, you can find it here.
Hi,
Nice tutorial, but I want this in Objective-C. The old tutorial's output is not the same as the Swift tutorial's: the placement of the POIs is properly aligned in the Swift tutorial, but in Objective-C the POIs overlap each other. Please provide an updated Objective-C tutorial. It would be very helpful for me. Please help me.
Hello, thanks for the awesome tutorial. I have integrated this successfully and it works fine. I want to add a feature: when the user taps an annotation view on the camera view, I want to display the path to it from the user's current location. I thought to do it by loading an arrow image on the view controller and changing its orientation as the direction changes while the user walks. Is this possible with this library? Can you guide me on how to start adding this feature to my app?
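The arrow idea in the question above can be sketched independently of the AR library: compute the bearing from the user to the POI, then rotate the arrow by the difference between that bearing and the device heading. This is a minimal sketch, assuming it lives in a view controller that is the CLLocationManagerDelegate; `arrowImageView` and `poiCoordinate` are hypothetical names:

```swift
import CoreLocation
import UIKit

// Initial bearing from `start` to `end`, in degrees clockwise from true north.
func bearing(from start: CLLocationCoordinate2D, to end: CLLocationCoordinate2D) -> Double {
    let lat1 = start.latitude * .pi / 180
    let lat2 = end.latitude * .pi / 180
    let dLon = (end.longitude - start.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    return (degrees + 360).truncatingRemainder(dividingBy: 360)
}

// In the CLLocationManagerDelegate: re-rotate the arrow on every heading update.
func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
    guard let user = manager.location?.coordinate else { return }
    let target = bearing(from: user, to: poiCoordinate)
    // Rotate by the angle between where the POI is and where the device points.
    let rotation = (target - newHeading.trueHeading) * .pi / 180
    arrowImageView.transform = CGAffineTransform(rotationAngle: CGFloat(rotation))
}
```

You would also need to call startUpdatingHeading() on the location manager, and re-run the rotation when the user's location changes, not only the heading.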
The Objective-C project's output is not the same as the Swift tutorial's: the location overlays in the camera layer overlap each other, while the Swift tutorial's output is very effective and good. Can anyone please provide the updated tutorial in Objective-C?
Thanks for the Swift tutorial for the best location-based augmented reality app.
If you use kCLLocationAccuracyBestForNavigation for the desired accuracy it should be possible to get this accuracy, but this depends on the GPS signal, and I guess that you won't get this accuracy often. But just try it and check the horizontalAccuracy property of CLLocation every time you receive an updated location. This tells you how accurate the position is: a value of 4 means that the real position is inside a circle with a radius of 4 meters.
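The check described above can be sketched like this; `AccuracyChecker` and the 2 m threshold are illustrative, not part of the tutorial code:

```swift
import CoreLocation

// Minimal sketch: request best-for-navigation accuracy and inspect
// horizontalAccuracy on every update to see how good the fix really is.
final class AccuracyChecker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        // horizontalAccuracy is the radius (in meters) of the circle the real
        // position lies in; negative values mean the fix is invalid.
        let accuracy = location.horizontalAccuracy
        if accuracy >= 0 && accuracy <= 2 {
            print("Fix good to ~\(accuracy) m — precise enough for 1-2 m placement")
        } else {
            print("Fix too coarse: \(accuracy) m")
        }
    }
}
```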