The Right Way to Take a Screenshot on iOS


Author: TyroneTang | Published 2017-04-25 10:49 | Read 266 times

Originally published 2014-12-22 09:20:48 on CocoaChina

Original article: http://www.cocoachina.com/ios/20141222/10713.html

Topic: iOS development

Yesterday I wrote a plugin that needed screenshot functionality and ran into problem after problem. I finally got it working today, so here are all the screenshot methods I tried while debugging.

Method 1

This method has been in use since the iOS 3 era and was deprecated in iOS 7. It is a private iOS API, and it is very efficient.

#import <UIKit/UIKit.h>

extern "C" CGImageRef UIGetScreenImage();

UIImage *screenshot(void) NS_DEPRECATED_IOS(3_0, 7_0);
UIImage *screenshot() {
    // UIGetScreenImage() captures the entire screen, status bar included.
    UIImage *image = [UIImage imageWithCGImage:UIGetScreenImage()];
    return image;
}

Method 2

This is the most common screenshot method, but it does not support Retina displays.

UIImage *screenshot(UIView *);
UIImage *screenshot(UIView *view) {
    UIGraphicsBeginImageContext(view.frame.size);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

Method 3

Starting with the iPhone 4 and the 4th-generation iPod touch, Apple moved to Retina displays, so the iOS 4 SDK gave us UIGraphicsBeginImageContextWithOptions, and the screenshot method above naturally became this:

UIImage *screenshot(UIView *) NS_AVAILABLE_IOS(4_0);
UIImage *screenshot(UIView *view) {
    if (UIGraphicsBeginImageContextWithOptions != NULL) {
        // Scale 0.0 means "use the main screen's scale", so the image
        // comes out at full Retina resolution.
        UIGraphicsBeginImageContextWithOptions(view.frame.size, NO, 0.0);
    } else {
        UIGraphicsBeginImageContext(view.frame.size);
    }
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

Method 4

You might object that sometimes what you hook is a button's method, in which case there is no view to pass to the third method. Fortunately, iOS 7 added some UIScreen APIs.

UIImage *screenshot(void) NS_AVAILABLE_IOS(7_0);
UIImage *screenshot() {
    UIView *view = [[UIScreen mainScreen] snapshotViewAfterScreenUpdates:YES];
    if (UIGraphicsBeginImageContextWithOptions != NULL) {
        UIGraphicsBeginImageContextWithOptions(view.frame.size, NO, 0.0);
    } else {
        UIGraphicsBeginImageContext(view.frame.size);
    }
    // Caveat: snapshot views often render empty through renderInContext:;
    // the documented drawViewHierarchyInRect:afterScreenUpdates: (iOS 7+)
    // is the more reliable way to rasterize them.
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

Method 5

@interface SBScreenShotter : NSObject
+ (id)sharedInstance;
- (void)saveScreenshot:(_Bool)arg1;
@end

Then simply call:

[[SBScreenShotter sharedInstance] saveScreenshot:YES];

After a flash of white, we have simulated the user's own screenshot gesture. This approach is fine when you only need a single screenshot, but if you want to record the screen (which is really just taking screenshots continuously), the flashing would blind you. We also never get a UIImage instance to stitch into a video. And even if we hooked other classes to obtain the UIImage, this private API's efficiency probably could not meet the 30 FPS requirement of video anyway.

So now we have five methods. The first is a private API, and private APIs are usually more efficient and higher quality than documented APIs, but it was removed in iOS 7. Is there really nothing else?

Of course there is! Use a private framework to get the job done: go straight to the low level, grab the screen's framebuffer data, and build a UIImage instance from it.

Method 6

// NOTE: the five #import targets were stripped by the original page's HTML.
// Based on the frameworks and header paths configured below, the likely
// candidates are the UIKit umbrella plus the IOKit, IOSurface, and
// IOMobileFramebuffer headers shipped in ./headers/:
#import <UIKit/UIKit.h>
#import <IOKit/IOKitLib.h>
#import "IOSurface/IOSurface.h"
#import "IOSurface/IOSurfaceAccelerator.h"
#import "IOMobileFramebuffer/IOMobileFramebuffer.h"

extern "C" IOReturn IOSurfaceLock(IOSurfaceRef buffer, uint32_t options, uint32_t *seed);
extern "C" IOReturn IOSurfaceUnlock(IOSurfaceRef buffer, uint32_t options, uint32_t *seed);
extern "C" size_t IOSurfaceGetWidth(IOSurfaceRef buffer);
extern "C" size_t IOSurfaceGetHeight(IOSurfaceRef buffer);
extern "C" IOSurfaceRef IOSurfaceCreate(CFDictionaryRef properties);
extern "C" void *IOSurfaceGetBaseAddress(IOSurfaceRef buffer);
extern "C" size_t IOSurfaceGetBytesPerRow(IOSurfaceRef buffer);

extern const CFStringRef kIOSurfaceAllocSize;
extern const CFStringRef kIOSurfaceWidth;
extern const CFStringRef kIOSurfaceHeight;
extern const CFStringRef kIOSurfaceIsGlobal;
extern const CFStringRef kIOSurfaceBytesPerRow;
extern const CFStringRef kIOSurfaceBytesPerElement;
extern const CFStringRef kIOSurfacePixelFormat;

enum {
    kIOSurfaceLockReadOnly  = 0x00000001,
    kIOSurfaceLockAvoidSync = 0x00000002
};

UIImage *screenshot(void);
UIImage *screenshot() {
    IOMobileFramebufferConnection connect;
    kern_return_t result;
    CoreSurfaceBufferRef screenSurface = NULL;

    // Probe the display service; the class name varies by device.
    io_service_t framebufferService = IOServiceGetMatchingService(kIOMasterPortDefault, IOServiceMatching("AppleH1CLCD"));
    if (!framebufferService)
        framebufferService = IOServiceGetMatchingService(kIOMasterPortDefault, IOServiceMatching("AppleM2CLCD"));
    if (!framebufferService)
        framebufferService = IOServiceGetMatchingService(kIOMasterPortDefault, IOServiceMatching("AppleCLCD"));

    result = IOMobileFramebufferOpen(framebufferService, mach_task_self(), 0, &connect);
    result = IOMobileFramebufferGetLayerDefaultSurface(connect, 0, &screenSurface);

    uint32_t aseed;
    IOSurfaceLock((IOSurfaceRef)screenSurface, kIOSurfaceLockReadOnly, &aseed);
    size_t width  = IOSurfaceGetWidth((IOSurfaceRef)screenSurface);
    size_t height = IOSurfaceGetHeight((IOSurfaceRef)screenSurface);

    // 4 bytes per ARGB pixel; one row is width * 4 bytes.
    // (Caveat: CFNumberCreate with kCFNumberSInt32Type reads only 4 bytes,
    // so on 64-bit these size_t variables work only on little-endian CPUs.)
    CFMutableDictionaryRef dict;
    size_t pitch = width * 4, size = width * height * 4;
    int bPE = 4;
    char pixelFormat[4] = {'A', 'R', 'G', 'B'};
    dict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                     &kCFTypeDictionaryKeyCallBacks,
                                     &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(dict, kIOSurfaceIsGlobal, kCFBooleanTrue);
    CFDictionarySetValue(dict, kIOSurfaceBytesPerRow, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &pitch));
    CFDictionarySetValue(dict, kIOSurfaceBytesPerElement, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &bPE));
    CFDictionarySetValue(dict, kIOSurfaceWidth, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &width));
    CFDictionarySetValue(dict, kIOSurfaceHeight, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &height));
    CFDictionarySetValue(dict, kIOSurfacePixelFormat, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, pixelFormat));
    CFDictionarySetValue(dict, kIOSurfaceAllocSize, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &size));

    // Copy the live screen surface into our own surface with the
    // hardware accelerator, then unlock the screen surface.
    IOSurfaceRef destSurf = IOSurfaceCreate(dict);
    IOSurfaceAcceleratorRef outAcc;
    IOSurfaceAcceleratorCreate(NULL, 0, &outAcc);
    IOSurfaceAcceleratorTransferSurface(outAcc, (IOSurfaceRef)screenSurface, destSurf, dict, NULL);
    IOSurfaceUnlock((IOSurfaceRef)screenSurface, kIOSurfaceLockReadOnly, &aseed);
    CFRelease(outAcc);

    // Wrap the copied pixels in a CGImage, then a UIImage.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, IOSurfaceGetBaseAddress(destSurf), (width * height * 4), NULL);
    CGImageRef cgImage = CGImageCreate(width, height, 8, 8 * 4, IOSurfaceGetBytesPerRow(destSurf),
                                       CGColorSpaceCreateDeviceRGB(),
                                       kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                                       provider, NULL, YES, kCGRenderingIntentDefault);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    return image;
}

Note that this sixth method requires a small change to the IOMobileFramebuffer header:

typedef void *IOMobileFramebufferConnection;

In the reversed header, IOMobileFramebufferConnection is typedef'd to io_connect_t, which is typedef'd to io_object_t, which is mach_port_t, which is __darwin_mach_port_t, which is __darwin_mach_port_name_t, which is __darwin_natural_t, which is unsigned int! Int just happens to be pointer-sized on 32-bit, but is not under 64-bit.

——Stack Overflow

I've uploaded the fixed header as well; unzip it and put it in the project's root directory.

If you're using theos, remember to add these lines to your Makefile:

YOUR_TWEAK_NAME_PRIVATE_FRAMEWORKS = IOSurface IOKit IOMobileFramebuffer

YOUR_TWEAK_NAME_CFLAGS = -I./headers/ -I./headers/IOSurface
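For context, a complete theos Makefile using these variables might look like the following sketch; the tweak name ScreenGrabber and the source file name Tweak.xm are hypothetical placeholders, and the include paths assume the classic theos layout:

```makefile
TWEAK_NAME = ScreenGrabber
ScreenGrabber_FILES = Tweak.xm
ScreenGrabber_PRIVATE_FRAMEWORKS = IOSurface IOKit IOMobileFramebuffer
ScreenGrabber_CFLAGS = -I./headers/ -I./headers/IOSurface

include theos/makefiles/common.mk
include theos/makefiles/tweak.mk
```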

If you're building a Logos tweak in Xcode instead, add an entry under Build Settings -> Search Paths -> Header Search Paths: $(PROJECT_DIR)/YOUR_PROJECT_NAME/headers, with the search mode set to recursive. Finally, in Build Phases, link the three private frameworks IOSurface, IOKit, and IOMobileFramebuffer.

