A WWDC19 scholarship submission.
The project showcases how different animals (dogs, cats, bulls, birds, snakes, and bees) see the world by simulating their eyesight with CoreImage filters applied to a live camera preview, shown in comparison with human eyesight.
Technologies used: AVFoundation, CoreGraphics, CoreLocation, CoreImage, Metal, SIMD, UIKit.
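To illustrate the general idea (this is a minimal sketch, not the project's actual Metal-backed filter chain), a dichromatic "dog vision" effect could be approximated by running each camera frame through a `CIColorMatrix` filter. The matrix values below are hypothetical:

```swift
import AVFoundation
import CoreImage

/// A minimal sketch of applying a Core Image filter to live camera frames.
/// The color matrix is a hypothetical approximation of dichromatic
/// (dog-like) color perception, not the project's actual values.
final class FilteredPreview: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let context = CIContext()

    /// Builds a CIColorMatrix filter that mixes the red and green channels,
    /// roughly imitating a dichromat's reduced color discrimination.
    private func makeDichromacyFilter(for image: CIImage) -> CIFilter? {
        let filter = CIFilter(name: "CIColorMatrix")
        filter?.setValue(image, forKey: kCIInputImageKey)
        filter?.setValue(CIVector(x: 0.625, y: 0.375, z: 0, w: 0), forKey: "inputRVector")
        filter?.setValue(CIVector(x: 0.7, y: 0.3, z: 0, w: 0), forKey: "inputGVector")
        filter?.setValue(CIVector(x: 0, y: 0.3, z: 0.7, w: 0), forKey: "inputBVector")
        return filter
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let filtered = makeDichromacyFilter(for: cameraImage)?.outputImage else { return }
        // Render the filtered image back into the frame's pixel buffer.
        // A real implementation would more likely draw into a Metal-backed view.
        context.render(filtered, to: pixelBuffer)
    }
}
```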
This project is distributed both as an app and as a playground. The app was used during development and debugging, while the playground served as the actual submission.
Running this project as an app requires Xcode 10.1 or newer and iOS 12.1 or newer. Both iPhone and iPad are supported as destination devices. After cloning the repository, open Project/Sight.xcodeproj, change the code signing settings in Configuration/Sight.xcconfig and run the app on your destination device.
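The code signing settings are ordinary xcconfig build settings. A hypothetical example of what the overrides might look like (the actual keys in Configuration/Sight.xcconfig may differ):

```xcconfig
// Hypothetical signing overrides — adjust to your own team and bundle identifier.
DEVELOPMENT_TEAM = ABCDE12345
PRODUCT_BUNDLE_IDENTIFIER = com.example.Sight
CODE_SIGN_STYLE = Automatic
```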
Running this project as a playground requires Swift Playgrounds 2.2 or newer running on iOS 12.1 or newer. After cloning the repository, you can copy Playground/Sight.playground directly into the Swift Playgrounds app.
This project was made with sweat 😪 and caffeine ☕️ with a tbsp of hope 🤞 by Adrian Kashivskyy.
This project is licensed under the Creative Commons BY-SA 4.0 License.
All photographs used in the project are licensed under the Pexels Photo License. All medical illustrations used in the project come from Wikimedia and are licensed under the CC-BY-SA-3.0 License. All icons used in the project come from Icons8. The redistributed xcconfigs are licensed under the MIT License.