Need help creating an AUv3 plugin.

edited February 2 in App Development

I need help creating an AUv3 plugin based on a WebView.

The plugin works perfectly, but I can’t retrieve the playhead position from the host application, whether it’s AUM, Loopy Pro, or Logic.

Here is my code:

import CoreAudioKit
import WebKit
import AVFoundation

public class AudioUnitViewController: AUViewController, AUAudioUnitFactory {
    var audioUnit: AUAudioUnit?
    @IBOutlet weak var webView: WKWebView!
    var transportTimer: Timer?

    public override func viewDidLoad() {
        super.viewDidLoad()
        print("Event: viewDidLoad")
        let myProjectBundle: Bundle = Bundle.main
        if let myUrl = myProjectBundle.url(forResource: "view/index", withExtension: "html") {
            webView.loadFileURL(myUrl, allowingReadAccessTo: myUrl)
        } else {
            print("Error: Could not find index.html in bundle.")
        }
    }

    public override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        print("Event: viewDidAppear")
        logHostProperties()
        startTransportTimer()
    }

    public override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        print("Event: viewDidDisappear")
        stopTransportTimer()
    }

    public func createAudioUnit(with componentDescription: AudioComponentDescription) throws -> AUAudioUnit {
        print("Event: createAudioUnit")
        audioUnit = try webviewauv3AudioUnit(componentDescription: componentDescription, options: [])
        // Allocate render resources to enable transport state retrieval
        try audioUnit?.allocateRenderResources()
        return audioUnit!
    }

    func startTransportTimer() {
        transportTimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(checkHostTransport), userInfo: nil, repeats: true)
    }

    func stopTransportTimer() {
        transportTimer?.invalidate()
        transportTimer = nil
    }

    func logHostProperties() {
        guard let au = audioUnit else {
            print("Error: audioUnit is nil")
            return
        }
        if au.outputBusses.count > 0 {
            let format = au.outputBusses[0].format
            print("Host Event: Sample Rate: \(format.sampleRate)")
        } else {
            print("Error reading sample rate")
        }
    }

    @objc func checkHostTransport() {
        guard let au = audioUnit else {
            print("Error: audioUnit is nil")
            return
        }
        if let transportStateBlock = au.transportStateBlock {
            var transportStateChanged: AUHostTransportStateFlags = AUHostTransportStateFlags(rawValue: 0)
            var currentSampleTime: Double = 0
            var currentBeat: Double = 0
            var currentTempo: Double = 0
            let success = transportStateBlock(&transportStateChanged,
                                              &currentSampleTime,
                                              &currentBeat,
                                              &currentTempo)
            if success {
                DispatchQueue.main.async {
                    print("Host Transport State Changed: \(transportStateChanged)")
                    print("Current Sample Time: \(currentSampleTime)")
                    print("Current Beat Position: \(currentBeat)")
                    print("Current Tempo: \(currentTempo)")
                }
            } else {
                print("Failed to retrieve host transport state.")
            }
        } else {
            print("Transport state block is not available.")
        }
    }
}

It’s very simple. However, I keep getting “Failed to retrieve host transport state.”

If anyone can help or has any leads, I’d really appreciate it.

Thank you very much!

Comments

  • Tagging @brambos and @Michael to see if they can be of assistance. Best of luck. 🤞

  • edited February 2

    You could try calling it at the beginning of your render block, and see if it works that way, on-demand calling of the block may very well be unsupported.
    Based on this comment in AUAudioUnit.h:

    If the host app provides this block to an AUAudioUnit (as its transportStateBlock), then
    the block may be called at the beginning of each render cycle to obtain information about
    the current transport state.
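
    In practice, “calling it at the beginning of your render block” looks something like the sketch below. This is illustrative only: the class name is made up, and a real implementation would cache the block in `allocateRenderResources()` and publish the values to the UI through an atomic or lock-free hand-off rather than a plain stored property.

    ```swift
    import AVFoundation

    // Illustrative sketch: query the host's transportStateBlock from inside
    // internalRenderBlock, where hosts are expected to service it.
    class TransportReadingAudioUnit: AUAudioUnit {
        // Written on the render thread; real code should hand this off to the
        // UI thread via atomics or a lock-free slot, not a plain property.
        private var lastBeat: Double = 0

        override var internalRenderBlock: AUInternalRenderBlock {
            return { [weak self] _, _, _, _, _, _, _ in
                // For real-time safety, cache transportStateBlock during
                // allocateRenderResources(); accessing self is kept simple here.
                if let transportBlock = self?.transportStateBlock {
                    var flags = AUHostTransportStateFlags()
                    var sampleTime = 0.0
                    var beat = 0.0
                    var tempo = 0.0
                    if transportBlock(&flags, &sampleTime, &beat, &tempo) {
                        self?.lastBeat = beat
                    }
                }
                return noErr
            }
        }
    }
    ```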

  • @zoltan said:
    You could try calling it at the beginning of your render block, and see if it works that way. Based on this comment in AUAudioUnit.h:

    If the host app provides this block to an AUAudioUnit (as its transportStateBlock), then
    the block may be called at the beginning of each render cycle to obtain information about
    the current transport state.

    Thanks.
    I’ll try this right now.

  • @zoltan said:
    You could try calling it at the beginning of your render block, and see if it works that way, on-demand calling of the block may very well be unsupported.
    Based on this comment in AUAudioUnit.h:

    If the host app provides this block to an AUAudioUnit (as its transportStateBlock), then
    the block may be called at the beginning of each render cycle to obtain information about
    the current transport state.

    Thank you very much, I can now retrieve the host application’s timecode.
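
    For anyone doing the same with a WebView-based UI: once the render block is publishing the transport values, they still have to be pushed into the page from the main thread. A hedged sketch, where `updateTransport` is a hypothetical JavaScript function defined in `index.html` and `latestBeat`/`latestTempo` stand for whatever values the render block hands off (via atomics or a lock-free slot in real code):

    ```swift
    // Call this from a main-thread timer in the view controller.
    func pushTransportToWebView() {
        let beat = latestBeat    // read on the main thread
        let tempo = latestTempo
        // updateTransport() is a hypothetical function in the loaded page.
        let js = "updateTransport(\(beat), \(tempo));"
        webView.evaluateJavaScript(js) { _, error in
            if let error = error {
                print("JS bridge error: \(error)")
            }
        }
    }
    ```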

  • @Jeezs said:

    @zoltan said:
    You could try calling it at the beginning of your render block, and see if it works that way, on-demand calling of the block may very well be unsupported.
    Based on this comment in AUAudioUnit.h:

    If the host app provides this block to an AUAudioUnit (as its transportStateBlock), then
    the block may be called at the beginning of each render cycle to obtain information about
    the current transport state.

    Thank you very much, I can now retrieve the host application’s timecode.

    Great, cheers!

  • @_ki or @wim c might be able to help…

    There are some app developers in here as well.

  • wim
    edited February 2

    ha! not me. I tried and failed to learn to make AUv3 plugins. Hacking together a few Mozaic scripts is as far as I go as a "developer". 😂

  • _ki
    edited February 3

    @Jeezs I also haven’t coded any AUv3 or done anything in Swift. A long, long time ago I did some app development with Xcode in Objective-C, in the AR space, but that predates IAA and AUv3 :)

    If I remember correctly, several devs published their sources (or parts of them), like MIDI Tape Recorder from Geert Bevin or MIDISequencerAUv3 from Cem Olcay. Both are AUv3 MIDI plugins, so they might provide some answers on how to handle this stuff.
