      didOutputSampleBuffer delegate not called

        Date: 2023-09-12
                  This article covers how to handle the case where the didOutputSampleBuffer delegate is never called; it may be a useful reference if you have run into the same problem.

                  Problem description


                  The didOutputSampleBuffer function in my code is never called, and I don't know why. Here's the code:

                  import UIKit
                  import AVFoundation
                  import Accelerate

                  class ViewController: UIViewController {

                      var captureSession: AVCaptureSession?
                      var dataOutput: AVCaptureVideoDataOutput?
                      var customPreviewLayer: AVCaptureVideoPreviewLayer?

                      @IBOutlet weak var camView: UIView!

                      override func viewWillAppear(animated: Bool) {
                          super.viewDidAppear(animated)
                          captureSession?.startRunning()
                          //setupCameraSession()
                      }

                      override func viewDidLoad() {
                          super.viewDidLoad()
                          // Do any additional setup after loading the view, typically from a nib.
                          //captureSession?.startRunning()
                          setupCameraSession()
                      }

                      override func didReceiveMemoryWarning() {
                          super.didReceiveMemoryWarning()
                          // Dispose of any resources that can be recreated.
                      }

                      func setupCameraSession() {
                          // Session
                          self.captureSession = AVCaptureSession()
                          captureSession!.sessionPreset = AVCaptureSessionPreset1920x1080
                          // Capture device
                          let inputDevice: AVCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
                          var deviceInput = AVCaptureDeviceInput()

                          do {
                              deviceInput = try AVCaptureDeviceInput(device: inputDevice)
                          } catch let error as NSError {
                              print(error)
                          }
                          if captureSession!.canAddInput(deviceInput) {
                              captureSession!.addInput(deviceInput)
                          }
                          // Preview
                          self.customPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                          self.customPreviewLayer!.frame = camView.bounds
                          self.customPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
                          self.customPreviewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.Portrait
                          camView.layer.addSublayer(self.customPreviewLayer!)
                          print("Cam layer added")

                          self.dataOutput = AVCaptureVideoDataOutput()
                          self.dataOutput!.videoSettings = [
                              String(kCVPixelBufferPixelFormatTypeKey) : Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
                          ]

                          dataOutput!.alwaysDiscardsLateVideoFrames = true
                          if captureSession!.canAddOutput(dataOutput) {
                              captureSession!.addOutput(dataOutput)
                          }
                          captureSession!.commitConfiguration()
                          let queue: dispatch_queue_t = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL)
                          let delegate = VideoDelegate()
                          dataOutput!.setSampleBufferDelegate(delegate, queue: queue)
                      }

                      func captureOutput(captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection) {
                          let imageBuffer: CVImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)!
                          CVPixelBufferLockBaseAddress(imageBuffer, 0)
                          // For iOS the luma is contained in the full (8-bit) plane
                          let width: size_t = CVPixelBufferGetWidthOfPlane(imageBuffer, 0)
                          let height: size_t = CVPixelBufferGetHeightOfPlane(imageBuffer, 0)
                          let bytesPerRow: size_t = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0)
                          let lumaBuffer: UnsafeMutablePointer = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0)
                          let grayColorSpace: CGColorSpaceRef = CGColorSpaceCreateDeviceGray()!
                          let context: CGContextRef = CGBitmapContextCreate(lumaBuffer, width, height, 8, bytesPerRow, grayColorSpace, CGImageAlphaInfo.NoneSkipFirst.rawValue)!
                          let dstImageFilter: CGImageRef = CGBitmapContextCreateImage(context)!
                          dispatch_sync(dispatch_get_main_queue(), {() -> Void in
                              self.customPreviewLayer!.contents = dstImageFilter as AnyObject
                          })
                      }

                  }
                  


                  And here is my VideoDelegate code:

                  import Foundation
                  import AVFoundation
                  import UIKit
                  
                  // Video Delegate
                  class VideoDelegate : NSObject, AVCaptureVideoDataOutputSampleBufferDelegate
                  {
                  
                      func captureOutput(captureOutput: AVCaptureOutput!,
                          didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                          fromConnection connection: AVCaptureConnection!){
                              print("hihi")
                  
                      }
                  
                  
                      func captureOutput(captureOutput: AVCaptureOutput!,
                          didDropSampleBuffer sampleBuffer: CMSampleBuffer!,
                          fromConnection connection: AVCaptureConnection!){
                  
                              print("LOL")
                      }
                  
                  
                  }
                  


                  Why doesn't my delegate get called, and how do I fix it? I've checked similar questions on Stack Overflow, but I can't find a solution. Please help.

                  Recommended answer


                  I found the cause of my error! The delegate methods have to be implemented in the same view controller. Here is the modified code:

                  import UIKit
                  import AVFoundation
                  import Accelerate

                  var customPreviewLayer: AVCaptureVideoPreviewLayer?

                  class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

                      var captureSession: AVCaptureSession?
                      var dataOutput: AVCaptureVideoDataOutput?
                      //var customPreviewLayer: AVCaptureVideoPreviewLayer?

                      @IBOutlet weak var camView: UIView!

                      override func viewWillAppear(animated: Bool) {
                          super.viewWillAppear(animated) // was super.viewDidAppear, which skipped the matching superclass call
                          //setupCameraSession()
                      }

                      override func viewDidLoad() {
                          super.viewDidLoad()
                          // Do any additional setup after loading the view, typically from a nib.
                          setupCameraSession()
                          self.captureSession?.startRunning()
                      }

                      override func didReceiveMemoryWarning() {
                          super.didReceiveMemoryWarning()
                          // Dispose of any resources that can be recreated.
                      }

                      func setupCameraSession() {
                          // Session
                          self.captureSession = AVCaptureSession()
                          self.captureSession!.sessionPreset = AVCaptureSessionPreset1920x1080
                          // Capture device
                          let inputDevice: AVCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
                          var deviceInput = AVCaptureDeviceInput()
                          do {
                              deviceInput = try AVCaptureDeviceInput(device: inputDevice)
                          } catch let error as NSError {
                              // Handle errors
                              print(error)
                          }
                          if self.captureSession!.canAddInput(deviceInput) {
                              self.captureSession!.addInput(deviceInput)
                          }
                          // Preview
                          customPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                          customPreviewLayer!.frame = camView.bounds
                          customPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
                          customPreviewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.Portrait
                          self.camView.layer.addSublayer(customPreviewLayer!)
                          print("Cam layer added")

                          self.dataOutput = AVCaptureVideoDataOutput()
                          self.dataOutput!.videoSettings = [
                              String(kCVPixelBufferPixelFormatTypeKey) : Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
                          ]
                          self.dataOutput!.alwaysDiscardsLateVideoFrames = true
                          if self.captureSession!.canAddOutput(dataOutput) {
                              self.captureSession!.addOutput(dataOutput)
                          }
                          self.captureSession!.commitConfiguration()
                          let queue: dispatch_queue_t = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL)
                          //let delegate = VideoDelegate()
                          // The view controller itself is now the delegate, so it stays alive as long as the session.
                          self.dataOutput!.setSampleBufferDelegate(self, queue: queue)
                      }

                      func captureOutput(captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection) {
                          print("buffered")
                          let imageBuffer: CVImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)!
                          CVPixelBufferLockBaseAddress(imageBuffer, 0)
                          let width: size_t = CVPixelBufferGetWidthOfPlane(imageBuffer, 0)
                          let height: size_t = CVPixelBufferGetHeightOfPlane(imageBuffer, 0)
                          let bytesPerRow: size_t = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0)
                          let lumaBuffer: UnsafeMutablePointer = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0)
                          let grayColorSpace: CGColorSpaceRef = CGColorSpaceCreateDeviceGray()!
                          let context: CGContextRef = CGBitmapContextCreate(lumaBuffer, width, height, 8, bytesPerRow, grayColorSpace, CGImageAlphaInfo.PremultipliedLast.rawValue)! // problematic
                          let dstImageFilter: CGImageRef = CGBitmapContextCreateImage(context)!
                          CVPixelBufferUnlockBaseAddress(imageBuffer, 0) // balance the lock taken above
                          dispatch_sync(dispatch_get_main_queue(), {() -> Void in
                              customPreviewLayer!.contents = dstImageFilter as AnyObject
                          })
                      }

                  }
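A note on why the original version likely failed: Cocoa objects conventionally do not keep a strong reference to their delegates. If that holds for `setSampleBufferDelegate`, the local `VideoDelegate()` created inside `setupCameraSession` is deallocated as soon as the method returns, so there is no object left to call. If you would rather keep the delegate in a separate class instead of merging it into the view controller, holding a strong reference to it should also work. A minimal sketch in the same Swift 2 style as the code above (the `videoDelegate` property name is ours):

```swift
import UIKit
import AVFoundation

class CameraViewController: UIViewController {
    // Stored property: keeps the delegate alive for the lifetime of the
    // view controller, instead of letting it die at the end of setup.
    let videoDelegate = VideoDelegate()
    var dataOutput: AVCaptureVideoDataOutput?

    func attachDelegate() {
        let queue: dispatch_queue_t = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL)
        // Pass the long-lived stored property, not a local variable.
        self.dataOutput?.setSampleBufferDelegate(videoDelegate, queue: queue)
    }
}
```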
                  

                  This concludes the article on the didOutputSampleBuffer delegate not being called. We hope the recommended answer is helpful.
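One caveat for readers on current Swift and iOS SDKs: the Swift 2 era names above (`AVCaptureSessionPreset1920x1080`, `dispatch_queue_create`, and the `captureOutput(captureOutput:didOutputSampleBuffer:fromConnection:)` selector) have all been renamed. Because the delegate methods are optional, implementing the old signature in modern Swift still compiles but is silently never called, producing exactly this symptom. A minimal sketch of the current delegate shape:

```swift
import AVFoundation

final class CameraController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let output = AVCaptureVideoDataOutput()
    // Frames are delivered on this serial queue.
    private let queue = DispatchQueue(label: "VideoQueue")

    func configure() {
        session.sessionPreset = .hd1920x1080
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) {
            session.addOutput(output)
        }
    }

    // Modern signature: note the underscore and the shortened "didOutput" label.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        print("frame received")
    }
}
```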
