This repository has been archived by the owner on Aug 28, 2024. It is now read-only.

ImageSegmentation Build Failure on Mac M1 #51

Open
ZwwWayne opened this issue Apr 11, 2021 · 19 comments

@ZwwWayne

Hi, I ran into the compilation error below on a Mac M1:

ld: in /Users/xxx/projects/ios-demo-app/ImageSegmentation/Pods/LibTorch/install/lib/libtorch.a(empty.cpp.o),
building for iOS Simulator, but linking in object file built for iOS,
file '/Users/xxx/projects/ios-demo-app/ImageSegmentation/Pods/LibTorch/install/lib/libtorch.a' for architecture arm64

The environment information is as follows:
PyTorch: 1.8.1
Output of pod --version: 1.10.1
CMake version: 3.11.3

I also tried changing the Podfile from pod 'LibTorch', '~>1.7.0' to pod 'LibTorch', '~>1.8.0', but the failure persists.

@Jignya-vc

Jignya-vc commented Sep 29, 2021

Did anyone find a resolution for this issue?

@darkThanBlack

The simple answer is to add this to your Podfile. The arm64 objects inside the pod's libtorch.a were built for iOS devices, not for the simulator, so on an M1 (where the simulator builds for arm64 by default) you have to force the simulator build back to x86_64, which runs under Rosetta:

post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['EXCLUDED_ARCHS[sdk=iphonesimulator*]'] = 'arm64'
      config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '10.0'
      config.build_settings['ENABLE_BITCODE'] = 'NO'
    end
  end
end

@kenza-djeddiali

@darkThanBlack
Hi, I added this code but it still isn't working. I am using a MacBook Pro (M1 Pro).

platform :ios, '12.0'

target 'PTmodelTest' do
  pod 'LibTorch', '~>1.7.0'
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['EXCLUDED_ARCHS[sdk=iphonesimulator*]'] = 'arm64'
      config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '10.0'
      config.build_settings['ENABLE_BITCODE'] = 'NO'
    end
  end
end

Did anyone find a resolution?
Thank you

@darkThanBlack

@kenza-djeddiali Can you paste your Xcode error info or a screenshot for me?

@kenza-djeddiali

kenza-djeddiali commented Mar 28, 2022

Hi @darkThanBlack, thank you for your feedback.
Here it is:
[screenshot]

I changed some parameters yesterday, but now I have a new error (see https://www.devfaq.fr/question/xcode-12-construction-pour-ios-simulator-mais-liaison-dans-un-fichier-objet-construit-pour-ios-pour-l-39-architecture-arm64):
[screenshot]
and this:
[screenshot]

What I want is to develop my detection model and integrate it into the app. I tried several options: Create ML works really well; converting the PyTorch model to .pt is done (so I want to test it with PyTorch Mobile); but PyTorch to Core ML, no; and TensorFlow Lite has the same problem with the pod installation.

Tutorials I consulted: https://youtu.be/ca4RGvIY5cc, https://youtu.be/amTepUIR93k

Thank you, have a nice day 🤗

@darkThanBlack

@kenza-djeddiali

  • Try #import <Libtorch-Lite/Libtorch-Lite.h>
  • As a beginner, I made the same mistakes. My immediate conclusion is: don't try to convert machine learning models from one framework to another. As iOS developers, we might habitually think of a model as a special kind of file or folder that can be interpreted and executed by multiple frameworks, and yes, the frameworks look like they have really nice conversion tools.
    But in fact, to run a model well in a commercial-grade app, we roughly have to get all of the following right:
  1. The input data format;
  2. The algorithm interface;
  3. The model file;
  4. The machine learning framework that matches the model;
  5. The algorithm for parsing the output data;
  6. etc.

It all depends on the algorithm engineer or on the paper.
The reason is simple: the so-called framework is just a literal translation of the mathematical formulas and algorithms in the paper.
For example, in the PyTorch demo we need to call the forward() method. But why? What is forward? What does it mean?
There is also a size parameter: {1, 3, 224, 224}. But why? Why does the image size need to be 224? Can it be changed to 200? The answer is no; even changing it to 223 makes the output very different. All the parameters involved in this process are black magic; please carve this point into your DNA.
If we don't understand the whole machine learning pipeline, the only way is to ask the person who made the model and have them tell us how to set the specific parameters.
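
To make the point about forward() and the {1, 3, 224, 224} size concrete, here is a minimal Objective-C++ sketch of the kind of call that goes through the Libtorch-Lite pod. The function name runInference, the modelPath/pixelBuffer arguments, and the preprocessing assumptions are made up for illustration; only the torch:: calls and the shape quoted above come from the LibTorch mobile API and the comment above.

// Hypothetical sketch for a .mm file, assuming the Libtorch-Lite pod and a model
// exported for the lite interpreter. Names below are illustrative, not the demo's.
#import <Libtorch-Lite/Libtorch-Lite.h>

#include <string>
#include <vector>

std::vector<float> runInference(const std::string& modelPath, float* pixelBuffer) {
  // Load a TorchScript model saved for the mobile/lite interpreter.
  torch::jit::mobile::Module module = torch::jit::_load_for_mobile(modelPath);

  // pixelBuffer must already hold 1*3*224*224 floats in NCHW order, normalized the
  // same way the model was trained -- exactly the "black magic" described above:
  // the shape and preprocessing are dictated by the model, not by this code.
  at::Tensor input = torch::from_blob(pixelBuffer, {1, 3, 224, 224}, at::kFloat);

  // forward() runs the exported graph; the result is an IValue we unwrap to a tensor.
  at::Tensor output = module.forward({input}).toTensor();

  // How to interpret these floats (class scores, masks, ...) again depends on the model.
  float* data = output.data_ptr<float>();
  return std::vector<float>(data, data + output.numel());
}

Every number in this sketch (the shape, the normalization, the meaning of the output) comes from whoever trained the model, which is exactly the point of the list above.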

@kenza-djeddiali

Hi @darkThanBlack, thank you for your feedback. It's a real headache, I'm back to error 1 🤣.
(in /Users/ken/ProjetsIOS/PTmodelTest/Pods/LibTorch-Lite/install/lib/libtorch.a(empty.cpp.o), building for iOS Simulator, but linking in object file built for iOS, file '/Users/ken/ProjetsIOS/PTmodelTest/Pods/LibTorch-Lite/install/lib/libtorch.a' for architecture arm64)
I tried many, many things, but I'm stuck (e.g. https://discuss.pytorch.org/t/cant-use-ios-libtorch-as-a-dependency-for-a-library-using-cocoapods/75544/5).

It looks easy on paper: just convert ... but this is what actually happens. I've exhausted all my cards, lol; maybe I'll switch to TensorFlow Lite... though I couldn't run a TensorFlow model on-device on my M1 Pro machine either! We'll see what I can find.

Thank you @darkThanBlack

@darkThanBlack

@kenza-djeddiali Emm... have you heard of lipo -info? You can use it to check which architectures libtorch.a actually contains, e.g. lipo -info Pods/LibTorch-Lite/install/lib/libtorch.a. Correct output should look like the image below and include i386, armv7, x86_64, and arm64:
[screenshot]

If not, I suggest you give up on using CocoaPods with PyTorch, because the .a file in the pod wasn't built correctly. Try a custom build of PyTorch yourself; you can find the chapter of the same name in the PyTorch iOS guide. Create your own .a file and drag it into the project. I will consider uploading my demo if I have time later.

@kenza-djeddiali

kenza-djeddiali commented Apr 4, 2022

@darkThanBlack Hi, I'm going to look into that. Actually the .a file shows up in red; on the forums one suggested solution is to delete it, but then I got error 1 back.
Thank you, I would like to have code that works 😅

@den-run-ai

@darkThanBlack can you explain how your solution works?

@darkThanBlack

Demo project uploaded, LINK
@denfromufa
@kenza-djeddiali

@bqubique

bqubique commented Jun 2, 2022

I used the pods from @darkThanBlack's HelloWorld app (LibTorch-Lite 1.9.0), built it on a physical device using Xcode, and it worked.

@steve-ham

I'm getting the same error on https://github.com/pytorch/ios-demo-app/tree/master/HelloWorld.

ld: in /Users/steve.ham/Downloads/ios-demo-app-master/HelloWorld/HelloWorld/Pods/LibTorch-Lite/install/lib/libtorch.a(empty.cpp.o), building for iOS Simulator, but linking in object file built for iOS, file '/Users/steve.ham/Downloads/ios-demo-app-master/HelloWorld/HelloWorld/Pods/LibTorch-Lite/install/lib/libtorch.a' for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)

Chip: Apple M1 Pro
Pod: 'LibTorch-Lite', '~> 1.10.0'
Device:

  1. Works on a physical device (iPhone 12 Pro).
  2. Doesn't work on the iOS Simulator (iPhone 13 mini).

Are any LibTorch developers trying to fix this?

@steve-ham

Strangely, if I install 'LibTorch-Lite' together with 'OpenCV', the project runs on the simulator too.

platform :ios, '12.0'
target 'HelloWorld' do
    pod 'LibTorch-Lite'
    pod 'OpenCV'
end

@mlynch

mlynch commented Nov 1, 2022

@steve-ham does this actually work? I tried it and it fixes one compile error but causes another. I looked closer at the OpenCV pod and it's quite outdated and sets these excluded architectures for simulators which could just be masking the actual issue: https://github.com/CocoaPods/Specs/blob/master/Specs/5/b/d/OpenCV/4.3.0/OpenCV.podspec.json#L41

@steve-ham

steve-ham commented Nov 4, 2022

@mlynch Somehow my Podfile script above doesn't work anymore. However, it works if I add the config below to the app project's build settings (not the Pods project's build settings); this is the same EXCLUDED_ARCHS[sdk=iphonesimulator*] = arm64 setting as in the post_install hook above, applied to the app target instead:

  1. Tap the 'HelloWorld' project at the top left.
  2. Tap 'HelloWorld' under 'TARGETS'.
  3. Add the config below under 'Excluded Architectures':
    Debug: Any iOS Simulator SDK = arm64
    Release: Any iOS Simulator SDK = arm64

@trsa74

trsa74 commented Nov 8, 2022

@steve-ham After following all the steps you described above, I now get this error. Can you suggest a solution? Thanks.

Error: Target Integrity
The linked framework 'Pods_Labelling_Board.framework' is missing one or more architectures required by this target: x86_64.

Error location:
[screenshot]

I am using pod 'LibTorch', '~>1.10.0'


@pbanavara

This problem still persists. If you add arm64 to the excluded architectures, then when you run the app the assets or model files don't get copied and you get an LLDB error. Has anyone solved this for the iOS Simulator yet?
