Questions tagged [avfoundation]

0 votes · 0 answers · 4 views

AudioUnitInitialize causes Allow Microphone access prompt on iOS Simulator

I'm using AudioUnit to play back audio from a TeamSpeak server, but when I call AudioUnitInitialize on the iOS Simulator I constantly get the macOS prompt asking to allow microphone access, even though I only want to play back audio. On a real device everything works fine without any native prompts, but it is real...
Jan

0 votes · 0 answers · 5 views

AVPlayer Lock Screen updates stop working, only when another AVPlayer instance is played in a subsequent view controller

I have an application with view controllers A and B. View controller A is the presenter of view controller B (pushes it using a navigation controller). Both A and B have a separate AVPlayer inside them (A's plays audio, B's plays video; both are local files from the Documents directory). VC A configure...
FranticRock

1 vote · 1 answer · 320 views

__availableRawPhotoPixelFormatTypes is empty on iPhone 7+ and iOS11

I'm trying to capture RAW files with AVFoundation. However, I'm getting an empty array in __availableRawPhotoPixelFormatTypes. Here is my snippet: if self._photoOutput == nil { self._photoOutput = AVCapturePhotoOutput() print(self._photoOutput!.__availableRawPhotoPixelFormatTypes) } And the output is empt...
s1ddok

1 vote · 1 answer · 1.3k views

Processing subtitles while video is playing in iOS

I need to do some text processing on the subtitles of a local (or remote) video file as they appear on the screen. Currently I am using AVPlayerItem, and by looking into its assets I can see that there is a closed-caption track, but I cannot get the actual subtitle text. So the question is how can I g...
Ehsan
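
One way to get at caption text as it is displayed is AVPlayerItemLegibleOutput, the legible (caption/subtitle) counterpart of AVPlayerItemVideoOutput. A minimal sketch, with an illustrative class name and wiring that are not taken from the asker's code:

```swift
import AVFoundation

// Illustrative helper: receives caption cues as they are shown on screen.
final class SubtitleTap: NSObject, AVPlayerItemLegibleOutputPushDelegate {
    private let output = AVPlayerItemLegibleOutput()

    func attach(to item: AVPlayerItem) {
        output.setDelegate(self, queue: .main)
        item.add(output)
    }

    func legibleOutput(_ output: AVPlayerItemLegibleOutput,
                       didOutputAttributedStrings strings: [NSAttributedString],
                       nativeSampleBuffers nativeSamples: [Any],
                       forItemTime itemTime: CMTime) {
        // Each attributed string is one caption cue currently on screen.
        for cue in strings {
            print("Caption at \(CMTimeGetSeconds(itemTime))s: \(cue.string)")
        }
    }
}
```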

1 vote · 0 answers · 11 views

Play Audio from Local File

I want to play a downloaded local audio file but it doesn't play: class AVPlayerService { static let instance = AVPlayerService() private var audioPlayer: AVPlayer! public weak var delegate: AVPlayerServiceDelegate? func setupPlayer(forURL url: URL) { let playerItem: AVPlayerItem = AVPlayerItem(url:...
SwiftNewling
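
A minimal sketch of local-file playback with AVPlayer, assuming the common pitfalls in code like the excerpt above: the player must be held in a strong reference (a locally scoped AVPlayer is released before it plays), and the downloaded file must actually exist at the URL. Names are illustrative:

```swift
import AVFoundation

final class LocalAudioPlayer {
    // Keep a strong reference; a locally scoped AVPlayer is deallocated before it can play.
    private var player: AVPlayer?

    func play(fileURL: URL) {
        // Sanity-check that the downloaded file is really there.
        guard FileManager.default.fileExists(atPath: fileURL.path) else {
            print("No file at \(fileURL.path)")
            return
        }
        let item = AVPlayerItem(url: fileURL)
        player = AVPlayer(playerItem: item)
        player?.play()
    }
}
```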

1 vote · 1 answer · 186 views

AVAssetWriterInput: how to get the right aspect ratio of the images within the movie?

I'm trying to create a video based on a collection of images. The size of the images is held in inputSize; the size of the video is held in outputSize. I'd like the images to aspect-fit at full height in the video. But for some reason I can't explain, the final video shows the full-height image (good) but a st...
Stéphane de Luca

1 vote · 1 answer · 659 views

AVPlayerViewController not loading remote URL

I'm trying to play a remote MP4 video in an AVPlayerViewController: self.vcVideo = [[AVPlayerViewController alloc] init]; self.vcVideo.player = [AVPlayer playerWithURL:[NSURL URLWithString:videoURL]]; [self presentViewController:self.vcVideo animated:YES completion:^{ [self.vcVideo.player play]; }];...
scttnlsn

1 vote · 1 answer · 132 views

Turning retina images into video: “CGContextScaleCTM: invalid context 0x0. This is a serious error.”

The goal is to turn an array of retina images into a single video. The two functions below are responsible for writing a UIImage into an AVAssetWriterInputPixelBufferAdaptor. They work, and the code produces a seemingly valid video out of the images. However, Xcode complains with this error: CGConte...
Crashalot

1 vote · 1 answer · 296 views

How can I tell if the camera is in use by another process?

In OS X, how can I tell if the camera or microphone is in use by another application or process? The following doesn't seem to work unless the other application has locked the device. NSArray *devices = [AVCaptureDevice devices]; for (AVCaptureDevice *device in devices) { NSLog(@"In use by other app...
TerryB

1 vote · 2 answers · 1.5k views

Crash in AVCaptureSession when adding an AVCaptureDeviceInput

I have a weird crash showing on Crashlytics when setting up a camera session. The stacktrace shows that the crash occurred at the method addInput. func setupCamSession(){ self.captureSession = AVCaptureSession() self.cameraView.setSession(self.captureSession) self.sessionQueue = dispatch_queue_cre...
ELKA
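
A defensive sketch (Swift 3 era API, illustrative function name) that guards the addInput call; wrapping the throwing AVCaptureDeviceInput initializer and checking canAddInput avoids the kind of exception that surfaces in Crashlytics when the input is nil or incompatible:

```swift
import AVFoundation

// Illustrative setup: only add the camera input if it can be created and accepted.
func makeCameraSession() -> AVCaptureSession? {
    let session = AVCaptureSession()
    guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo) else {
        return nil  // no camera available (e.g. Simulator)
    }
    do {
        let input = try AVCaptureDeviceInput(device: camera)
        guard session.canAddInput(input) else { return nil }
        session.addInput(input)
        return session
    } catch {
        print("Could not create device input: \(error)")
        return nil
    }
}
```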

1 vote · 1 answer · 634 views

RTSP and iOS: How to tackle this? [closed]

My primary goal is to play an (s)RTP stream on iOS. Often, the suggestion is to use FFmpeg, but given that it is LGPL, I have my doubts about the compatibility of its license with the App Store. Since I know my media is H.264, my idea was to look deeper into RTSP and RTP, and 'simply' decode the payload to f...
below

1 vote · 1 answer · 161 views

AVAudioSessionCategoryAudioProcessing: is the lack of interruption notifications guaranteed?

If I set my AVAudioSession's category to AVAudioSessionCategoryAudioProcessing, I do not receive any interruption notifications regardless of what else I'm doing. (Locking the device, playing audio in another app, etc.) I understand why this is — AVAudioSessionCategoryAudioProcessing is for offlin...
Archagon

1 vote · 2 answers · 129 views

Image output from AVFoundation occupied only 1/4 of screen

I played around with AVFoundation trying to apply a filter to live video. I tried to apply the filter to AVCaptureVideoDataOutput, but the output occupied only 1/4 of the view. Here is some of my related code: Capturing let availableCameraDevices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) for d...
sarunw

1 vote · 2 answers · 1.6k views

Accessing multiple audio hardware outputs/channels using AVFoundation and Swift

How do I access additional audio hardware outputs other than 1-2 using AVFoundation? I'm writing Swift code for a Mac OS X app which plays MP3 files through various output devices (USB interface, Dante, Soundflower) and looks like the following: myPlayer = AVPlayer(URL: myFilePathURL) myPlayer.au...
Michael Sweet

1 vote · 1 answer · 127 views

How to pre-load a camera in a PageViewController so that there is no loading animation?

My app uses three view controllers in a Page View Controller. The user swipes left to access a camera. When the user navigates to the camera view, there is a clunky default camera loading animation. I would like the camera to be pre-loaded so that the user sees a fully active camera the moment they...

1 vote · 1 answer · 97 views

AVFoundation classes not resolved in OS X Xcode project

I started a new Xcode project as a command-line / OS X / Swift application. I import AVFoundation and add the following code at top-level scope, after also importing Foundation: let string = "Hello, World!" let utterance = AVSpeechUtterance(string: string) utterance.voice = AVSpeechSynthesisVoice(lan...
Alex Bollbach

1 vote · 1 answer · 160 views

Render different resolutions in AVMutableComposition

When using AVMutableComposition to add a CATextLayer on top of a video, is there a way to render the text layer in a higher resolution than that of the video? The video is currently 1024x576, and I cannot afford to increase the bitrate or file size more than that, but adding a text layer in this ren...
Daniel Larsson

1 vote · 1 answer · 654 views

AVAudioConverter.convertToBuffer throwing error code -50

I need to convert a .wav file recorded with 2 audio channels to a .wav that has only 1 channel, as well as reduce the bit depth from 32 to 16. I've been trying to use AVAudioConverter.convertToBuffer. However, the conversion is throwing an error: Error Domain=NSOSStatusErrorDomain Code=-50 '(null)' B...
Josh Buhler

1 vote · 1 answer · 296 views

How To Keep Two AVCaptureMovieFileOutput In Sync

I have two camera feeds coming into an OS X app and I am trying to save them using AVCaptureMovieFileOutput. It doesn't take long before the videos are out of sync. After a one-minute test they can be off by 1 to 5 seconds. After an hour-long test they are off by 20s. I feel there must be some sort of simple so...
Skyler Lauren

1 vote · 1 answer · 876 views

Accessing Video Frames in Real Time on iOS

I have to access video from a server and I want to be able to access its frames in real time so that I can do some image processing on them. I know how to do it offline using AVAssetReader, but this solution does not work for real-time processing.
user2232305
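
A sketch of one common approach for live access to decoded frames: attach an AVPlayerItemVideoOutput to the playing AVPlayerItem and poll it from a CADisplayLink. The class name and polling setup are illustrative assumptions:

```swift
import AVFoundation
import QuartzCore
import UIKit

// Illustrative helper: pulls the current frame as a CVPixelBuffer while the item plays.
final class FrameGrabber {
    private let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    private var displayLink: CADisplayLink?

    func attach(to item: AVPlayerItem) {
        item.add(output)
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .commonModes)
    }

    @objc private func tick() {
        let time = output.itemTime(forHostTime: CACurrentMediaTime())
        guard output.hasNewPixelBuffer(forItemTime: time),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) else { return }
        // pixelBuffer holds the frame being displayed; run image processing on it here.
        _ = pixelBuffer
    }
}
```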

1 vote · 1 answer · 176 views

Ways to do inter-frame video compression in AVFoundation

I've created a process to generate video 'slideshows' from collections of photographs and images in an application that I'm building. The process is functioning correctly, but creates unnecessarily large files given that any photographs included in the video repeat for 100 to 150 frames unchanged....
Ron Diel

1 vote · 1 answer · 93 views

Recording locations in time domain with AVFoundation

I followed a lot of tutorials, and every tutorial about recording in AVFoundation covers only recording video, audio, or both. I would like to record location data in the same time domain as the video/audio, on a separate track. These location waypoints are described with 5 properties only...
Marcin Kapusta

1 vote · 2 answers · 342 views

Swift 3: Get length of AVAudioPCMBuffer as TimeInterval?

All I'm trying to do is what the title says, but I can't seem to do it. I tried doing the following: func getLength(buffer: AVAudioPCMBuffer) -> TimeInterval { let framecount = buffer.frameCapacity let bytesperframe = buffer.format.streamDescription.pointee.mBytesPerFrame return TimeInterval(bytespe...
MysteryPancake
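
A minimal sketch of the usual calculation: divide the buffer's valid frame count (frameLength, rather than the frameCapacity used in the excerpt) by the format's sample rate in frames per second:

```swift
import AVFoundation

// Duration of the audio currently held in the buffer.
func duration(of buffer: AVAudioPCMBuffer) -> TimeInterval {
    return Double(buffer.frameLength) / buffer.format.sampleRate
}
```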

1 vote · 1 answer · 329 views

AVCaptureSession recording audio only after 1 to 2 seconds delay

I am creating an audio/video recording application for Mac OS X in Objective-C using the AVFoundation classes; the video starts to record on a button click. There is a delay of approximately 1 to 2 seconds before audio recording starts. Here is the code I used to set up the devices: deviceInput = [AVCaptureDeviceInput de...
Jeba Moses

1 vote · 1 answer · 173 views

Why does CVPixelBufferGetDataSize always return 32 bytes more data?

I am using three classes to capture the screen on OS X: AVCaptureSession *m_captureSession; // AVCaptureSession AVCaptureScreenInput *m_screenInput; // AVCaptureScreenInput AVCaptureVideoDataOutput *m_videoOutput; // AVCaptureVideoDataOutput And AVCaptureVideoDataOutputSampleBufferDelegate a...
deepsky

1 vote · 1 answer · 42 views

Stopping Music in SpriteKit when leaving a GameScene

I have implemented background music in my game using AVFoundation. However, when leaving the GameScene the music continues playing. I know that in view controllers you can use viewWillDisappear, but I can't find anything similar for GameScenes. var audioPlayer = AVAudioPlayer() let audioPath = Bundle....
Lampros Tzanetos
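
A minimal sketch of the SKScene counterpart to viewWillDisappear: willMove(from:) runs when the scene is about to be removed from its view, which is a reasonable place to stop the player. The property is illustrative:

```swift
import SpriteKit
import AVFoundation

class GameScene: SKScene {
    var audioPlayer: AVAudioPlayer?  // illustrative; configured elsewhere

    // Called just before the scene is removed from the SKView (e.g. on a scene transition).
    override func willMove(from view: SKView) {
        audioPlayer?.stop()
    }
}
```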

1 vote · 1 answer · 156 views

AudioToolbox recorder gets affected by AVFoundation audio player

So, I have the following classes: an audio recorder (using AudioToolbox and Core Audio) that records audio, and an audio player (using AVFoundation). The recorder captures audio and sends it to a server, which then replies with more audio, and the player plays the received audio. When I tried to call the recorder...
LuAndre

1 vote · 1 answer · 105 views

MP3 File Not Playing On Button Select

For simplicity, assume this view speaks alphabet letters when you click on the corresponding button. I have been attempting to play the audio files associated with each button, but the files never play. The buttons, individually, are never given outlets, but they are all connected to the function on Touc...
MoMN

1 vote · 1 answer · 267 views

How can I get the Sample Rate & Bits Per Channel on a CMAudioFormatDescription

I'm able to get the audio description from AVCaptureDeviceFormat. let formats = device.formats for format in formats { print(format.formatDescription) } But I would like to get directly at the mSampleRate and mBitsPerChannel properties. CMAudioFormatDescription 0x60000010b880 [0x7fffadb29d80] { mediaTyp...
Sammy The Hand
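
A sketch of pulling the AudioStreamBasicDescription out of the format description, which exposes mSampleRate and mBitsPerChannel directly; the helper function is illustrative:

```swift
import AVFoundation
import CoreMedia

// Illustrative helper: print sample rate and bit depth for each audio format.
func printAudioDetails(for device: AVCaptureDevice) {
    for format in device.formats {
        if let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(format.formatDescription)?.pointee {
            print("Sample rate: \(asbd.mSampleRate), bits per channel: \(asbd.mBitsPerChannel)")
        }
    }
}
```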

1 vote · 2 answers · 217 views

Can't understand the lag in my SpriteKit game

I'm making a simple SpriteKit game. It has sounds, such as a button-touch sound, played through AVFoundation. But I'm having trouble with this: my frame rate always drops when I hit any button that plays a sound, but if I mute all my sounds there is no lag. Here is my code: import AVFoundation class...
Nurassyl Nuridin

1 vote · 2 answers · 909 views

Save a rotated picture (Swift 3)

picture without rotation / picture with rotation. All I want to do is take photos that are saved with the rotation, exactly like the 'picture with rotation', using a custom camera built with AVFoundation. Right now my code takes photos and saves them to the photo gallery just like the 'picture without rotation'. im...

1 vote · 1 answer · 1.6k views

How do I add text to an image in Swift 3?

I have a basic camera app. When I take a picture, that image is then passed into the controller below. I want to be able to add text to the passed-in image as well as drag it around the view, scale it, etc., but I can't even get past actually adding text onto the image. I have tried a bunch of modern...
Daniel Dramond
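
A sketch (Swift 3 style attribute constants) of drawing a string onto a UIImage via an image context; the function name, font, and draw point are illustrative:

```swift
import UIKit

// Illustrative helper: returns a copy of the image with `text` drawn at `point`.
func addText(_ text: String, to image: UIImage, at point: CGPoint) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }

    image.draw(in: CGRect(origin: .zero, size: image.size))
    let attributes: [String: Any] = [
        NSFontAttributeName: UIFont.boldSystemFont(ofSize: 48),
        NSForegroundColorAttributeName: UIColor.white
    ]
    (text as NSString).draw(at: point, withAttributes: attributes)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```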

1 vote · 1 answer · 418 views

AVPlayerItemVideoOutput copyPixelBufferForItemTime gives incorrect CVPixelBufferRef on iOS for particular video

Has anyone run into a problem where copyPixelBufferForItemTime is incorrect on iOS for a particular video? I have an AVPlayerItemVideoOutput linked to the appropriate AVPlayerItem. I call copyPixelBufferForItemTime, receive a CVPixelBufferRef, and then retrieve an OpenGL texture from it. CVPixelBufferRef pb = [_playerVi...
Olha Pavliuk

1 vote · 1 answer · 828 views

AVCaptureVideoDataOutput videoSettings with kCVPixelBufferOpenGLESCompatibilityKey

I am trying to get OpenGL-compatible buffers from the camera. I am setting up AVCaptureVideoDataOutput like this: _videoDataOutput = [[AVCaptureVideoDataOutput alloc] init]; [_videoDataOutput setSampleBufferDelegate:self queue:_sessionQueue]; // NSArray *pixelsFormats = _videoDataOutput.availableVideoC...
user1150393

1 vote · 1 answer · 551 views

Swift: AVAudioPCMBuffer vs AVAudioBuffer vs AVAudioCompressedBuffer

I'm currently using an AVAudioPCMBuffer to play a .wav file in a simple game. However, when checking the documentation for it, I came across two other types of buffers I never saw anywhere else, the ones in the title: AVAudioBuffer and AVAudioCompressedBuffer. My question is, which one should I be u...
MysteryPancake

1 vote · 1 answer · 437 views

Swift: Rounded Corners on Camera View

I want to implement rounded corners on my RecordingViewController (which has the camera view). I tried going into the app delegate and changing the corner radius of the window; however, it did not seem to work. I believe the issue is with AVLayerVideoGravityResizeAspectFill, which seems to fil...
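
A sketch of one way to round the camera view: round and clip the container view's own layer so the aspect-fill preview layer is masked by it. The class name and radius are illustrative, and the preview layer's session still has to be assigned elsewhere:

```swift
import AVFoundation
import UIKit

// Illustrative container view whose rounded, clipping layer masks the preview.
final class RoundedCameraView: UIView {
    let previewLayer = AVCaptureVideoPreviewLayer()

    override init(frame: CGRect) {
        super.init(frame: frame)
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        layer.addSublayer(previewLayer)
        layer.cornerRadius = 16
        layer.masksToBounds = true   // the clipping is what actually rounds the video
    }

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        previewLayer.frame = bounds
    }
}
```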

1 vote · 1 answer · 152 views

Cannot record using Bluetooth speaker on iPad + Swift 3

I want to record with a Bluetooth speaker connected to an iPad. However, when using AVAudioRecorder, it switches to the built-in speaker. let session = AVAudioSession.sharedInstance() try! session.setCategory(AVAudioSessionCategoryPlayAndRecord) try! session.setActive(true) let recordSetting: [Str...
Patyo
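
A sketch (Swift 3 era AVAudioSession API) of passing the .allowBluetooth option when setting the category, which is what makes Bluetooth hands-free (HFP) routes eligible for recording; note that an A2DP-only Bluetooth speaker has no microphone the system can record from, regardless of the options:

```swift
import AVFoundation

// Illustrative session setup allowing Bluetooth HFP devices as record routes.
func configureSessionForBluetoothRecording() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord,
                            with: [.allowBluetooth, .defaultToSpeaker])
    try session.setActive(true)
}
```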

1 vote · 2 answers · 632 views

Change AVSpeechUtterance rate in real time

I am currently developing an iOS app that converts text to speech using AVSpeechSynthesizer. What I want is that, while the synthesizer is speaking, the utterance rate can be changed with a slider so the speaking speed changes. I am doing this in the IBAction of the slider: self.utterance = s...
abnerabbey
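
A sketch of the usual workaround, since the rate of an utterance that is already enqueued cannot be edited: stop at a word boundary and speak the remaining text as a new utterance at the new rate. The class and its bookkeeping are illustrative assumptions:

```swift
import AVFoundation

// Illustrative speaker that changes rate mid-speech by re-speaking the remainder.
final class AdjustableRateSpeaker: NSObject, AVSpeechSynthesizerDelegate {
    private let synthesizer = AVSpeechSynthesizer()
    private var remainingText = ""

    override init() {
        super.init()
        synthesizer.delegate = self
    }

    func speak(_ text: String, rate: Float) {
        remainingText = text
        speakRemainder(rate: rate)
    }

    // Slider action: stop at the next word boundary and resume with the new rate.
    func changeRate(to rate: Float) {
        synthesizer.stopSpeaking(at: .word)
        speakRemainder(rate: rate)
    }

    private func speakRemainder(rate: Float) {
        guard !remainingText.isEmpty else { return }
        let utterance = AVSpeechUtterance(string: remainingText)
        utterance.rate = rate
        synthesizer.speak(utterance)
    }

    // Track what has not been spoken yet, so changeRate knows where to resume.
    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           willSpeakRangeOfSpeechString characterRange: NSRange,
                           utterance: AVSpeechUtterance) {
        remainingText = (utterance.speechString as NSString).substring(from: characterRange.location)
    }
}
```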

1 vote · 1 answer · 335 views

AVCaptureDevice videoZoomFactor Pinch-to-Zoom Rate

I am trying to implement a 'Pinch-to-Zoom' feature with AVCaptureDevice in AVFoundation: @IBAction func pinchGestureDetected(_ gestureRecognizer: UIPinchGestureRecognizer) { switch gestureRecognizer.state { case .began: print("began") self.currenZoomFactor = self.videoDevice!.videoZoomFactor do { tr...
Gizmodo
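
A sketch of one common approach: remember the zoom factor at the start of the pinch, multiply it by the gesture's scale, and clamp to the device's supported range inside lockForConfiguration. The controller class is illustrative:

```swift
import AVFoundation
import UIKit

// Illustrative pinch-to-zoom handler for an AVCaptureDevice.
final class PinchZoomController {
    private let device: AVCaptureDevice
    private var zoomAtGestureStart: CGFloat = 1.0

    init(device: AVCaptureDevice) { self.device = device }

    func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        switch gesture.state {
        case .began:
            // Remember the starting factor so the pinch scales relative to it.
            zoomAtGestureStart = device.videoZoomFactor
        case .changed:
            let desired = zoomAtGestureStart * gesture.scale
            let clamped = max(1.0, min(desired, device.activeFormat.videoMaxZoomFactor))
            do {
                try device.lockForConfiguration()
                device.videoZoomFactor = clamped
                device.unlockForConfiguration()
            } catch {
                print("Could not lock device for configuration: \(error)")
            }
        default:
            break
        }
    }
}
```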

1 vote · 1 answer · 31 views

How to track the number of loops in AVAudioFoundation

I am making an audio player in my iOS project. I set up the play(audioOfUrl:URL, for times:Int) method by passing the URL and how many times the audio file will be played, as follows: func play(audioOfUrl:URL, for times:Int) { let urlPath = audioOfUrl loopsLeftOver = times do { let audio =...
Nan
