[RESOLVED] Adding image tracking to the iOS example?

I am trying to add image tracking to the iOS client example. While I have no issues detecting images, the coordinates reported are far too high / wrong. Even when I copy code directly from another project that works fine, I still get the odd coordinates (instead of values that seem reasonable, like 0.5 meters, I get values in the 100-1000 range). Since the code is exactly the same and taken from a working Metal project that draws a marker where the anchor is, I have become quite stuck. Has anyone else encountered this issue? Is there something that should be configured differently?

I just add the reference images to the session configuration and then call a function from the session's didUpdateFrame delegate callback (this is cut down/simplified from the code I intend to use, which would send this data as generic input events).
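The config side looks roughly like this (a minimal sketch; the "AR Resources" asset catalog group name and the method name are my own assumptions, not from the example):

#import <ARKit/ARKit.h>

- (void)startImageTracking:(ARSession *)session
{
    ARWorldTrackingConfiguration *config = [ARWorldTrackingConfiguration new];
    // Load the ARReferenceImages bundled in the asset catalog. Their
    // physical size must be specified in meters.
    config.detectionImages =
        [ARReferenceImage referenceImagesInGroupNamed:@"AR Resources"
                                               bundle:[NSBundle mainBundle]];
    [session runWithConfiguration:config];
}

And the function called from didUpdateFrame: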

- (void)updateAnchorsWithFrame:(ARFrame *)frame
{
    // Cap the number of anchors reported per frame.
    NSInteger anchorInstanceCount = MIN(frame.anchors.count, kMaxAnchorInstanceCount);

    // If there are more anchors than the cap, skip the oldest ones.
    NSInteger anchorOffset = 0;
    if (anchorInstanceCount == kMaxAnchorInstanceCount) {
        anchorOffset = MAX(frame.anchors.count - kMaxAnchorInstanceCount, 0);
    }
    NSLog(@"Num anchors: %ld", (long)anchorInstanceCount);
    for (NSInteger index = 0; index < anchorInstanceCount; index++) {
        ARAnchor *anchor = frame.anchors[index + anchorOffset];

        // Translation column of the anchor's world transform
        // (meters, in ARKit's right-handed world coordinate space).
        float x = anchor.transform.columns[3].x;
        float y = anchor.transform.columns[3].y;
        float z = anchor.transform.columns[3].z;
        NSLog(@"Orig Image pos: %f %f %f", x, y, z);
    }
}
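One thing worth noting: frame.anchors contains every anchor in the session (planes, world anchors, etc.), not only detected images. A variant that filters for ARImageAnchor and logs which image was matched might be safer (a sketch; the method name is my own):

- (void)logImageAnchorsInFrame:(ARFrame *)frame
{
    for (ARAnchor *anchor in frame.anchors) {
        // Only report anchors created by image detection.
        if (![anchor isKindOfClass:[ARImageAnchor class]]) {
            continue;
        }
        ARImageAnchor *imageAnchor = (ARImageAnchor *)anchor;
        // Translation column of the world transform, in meters.
        simd_float4 pos = imageAnchor.transform.columns[3];
        NSLog(@"Image '%@' pos: %f %f %f",
              imageAnchor.referenceImage.name, pos.x, pos.y, pos.z);
    }
}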

Resolved: for some reason the config of the marker I use had changed units when I copied it from another project, presumably centimeters read as meters, which inflates the reported distance by the same factor. Assumption is the mother of all f*ckups… Now on to making sure the data sent to the server is in the correct coordinate system.
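For anyone who hits the same thing: ARKit scales the estimated anchor distance by the reference image's stated physical size, so a size entered in centimeters but interpreted as meters pushes a marker half a meter away out to roughly 50 meters. A quick sanity check on the loaded images would have caught this (a sketch; the helper name and the 2 m threshold are my own):

- (void)validateReferenceImages:(NSSet<ARReferenceImage *> *)refImages
{
    for (ARReferenceImage *image in refImages) {
        // physicalSize is in meters; a printed marker should be well under ~2 m.
        CGSize size = image.physicalSize;
        NSLog(@"Ref image '%@': %.3f x %.3f m", image.name, size.width, size.height);
        if (size.width > 2.0 || size.height > 2.0) {
            NSLog(@"Warning: '%@' looks suspiciously large; check the units.", image.name);
        }
    }
}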