
Analysis Revamp #1624

Merged: 30 commits, merged into mixxxdj:master on Dec 30, 2018

Conversation

@uklotzde (Contributor) commented Apr 18, 2018

The successor of #1413

Related bug reports and feature requests:
https://bugs.launchpad.net/mixxx/+bug/1737537
https://bugs.launchpad.net/mixxx/+bug/1547916
https://bugs.launchpad.net/mixxx/+bug/1738227
https://bugs.launchpad.net/mixxx/+bug/1641153

Only the last commit contains the substantial changes to the analysis framework and finally enables multi-threaded batch analysis. All preceding commits contain smaller fixes and improvements that might qualify for inclusion in a 2.1.x fix release.

Still based on and targeted at 2.1, because rebasing on master causes merge conflicts. It will not be rebased on master until all commits intended for 2.1 have been cherry-picked.
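
For readers new to the topic, multi-threaded batch analysis boils down to a worker-pool pattern. The sketch below illustrates it in plain C++; all names are hypothetical and this is not the code in this PR, which lives in src/analyzer/analyzerthread.cpp and src/analyzer/trackanalysisscheduler.cpp (reviewed further down). A fixed number of worker threads drain a shared queue of tracks, so several tracks are analyzed concurrently.

// Minimal, generic sketch of multi-threaded batch analysis (hypothetical names,
// not the actual Mixxx code): a fixed pool of worker threads drains a shared
// queue of track paths, so several tracks are analyzed concurrently.
#include <algorithm>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>
#include <vector>

void analyzeTrack(const std::string& path) {
    // Placeholder for the real per-track analysis (e.g. waveform, BPM).
    std::cout << "analyzed " << path << '\n';
}

int main() {
    std::queue<std::string> pending({"01.flac", "02.flac", "03.flac", "04.flac"});
    std::mutex queueMutex;

    auto worker = [&]() {
        for (;;) {
            std::string path;
            {
                std::lock_guard<std::mutex> lock(queueMutex);
                if (pending.empty()) {
                    return; // nothing left to analyze
                }
                path = std::move(pending.front());
                pending.pop();
            }
            analyzeTrack(path); // heavy work happens outside the lock
        }
    };

    const unsigned numThreads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < numThreads; ++i) {
        pool.emplace_back(worker);
    }
    for (auto& t : pool) {
        t.join();
    }
    return 0;
}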

@uklotzde uklotzde changed the base branch from master to 2.1 April 18, 2018 14:52
@uklotzde uklotzde changed the title Analysis Revamp [Don't merge] Analysis Revamp Apr 18, 2018
@uklotzde uklotzde added this to the 2.2.0 milestone Apr 18, 2018
@Be-ing (Contributor) commented Apr 18, 2018

I just resolved the small merge conflict between 2.1 and master. This rebases cleanly on master.

@uklotzde (Contributor, Author) commented Apr 18, 2018

I've split off 2 potential candidates for 2.1.x:
#1625 Fidlib: Highly recommended! I've experienced a crash at least twice while playing until I patched Mixxx with this fix.
#1626 Vamp: Optional. The issue can only occur when running an ad-hoc and a batch analysis concurrently.

@uklotzde uklotzde changed the title [Don't merge] Analysis Revamp Analysis Revamp May 21, 2018
@uklotzde uklotzde mentioned this pull request May 21, 2018
@Be-ing Be-ing changed the base branch from 2.1 to master June 11, 2018 21:10
@Be-ing (Contributor) commented Jun 11, 2018

@uklotzde there is a small merge conflict with master.
@daschuer do you want to review this for 2.2.0?

@Be-ing (Contributor) commented Jun 23, 2018

There are some merge conflicts between this and the library redesign branch. Is this still a candidate for 2.2 or should we push it back to 2.3?

@uklotzde (Contributor, Author) commented:

This work is finished and has proven stable for me. I will not rebase it onto any feature branch that still has unresolved issues.

@Be-ing Be-ing modified the milestones: 2.2.0, 2.3.0 Jun 24, 2018
@Be-ing (Contributor) commented Jun 24, 2018

Okay, let's push it back to 2.3 then.

@Be-ing Be-ing mentioned this pull request Jun 27, 2018
@Be-ing (Contributor) commented Jun 27, 2018

@daschuer what are your thoughts about this? Could we merge it for 2.2?

@daschuer (Member) commented:

I will have no time to review this before beta.

@daschuer (Member) left a review:


Thank you for working on this and sorry for the long delay reviewing it.
I have done a first read. Please do NOT rebase this during review.

Review comments (since resolved or outdated) on:
src/analyzer/analyzerprogress.h
src/analyzer/analyzerthread.cpp
src/analyzer/trackanalysisscheduler.cpp
src/control/controlvalue.h
@Be-ing (Contributor) commented Dec 27, 2018

There is a deadlock deleting a track from GlobalTrackCache if batch analysis is running while Mixxx is shutting down. I can reproduce this every time. If I load an unanalyzed track to a deck then immediately quit Mixxx, there is no deadlock. If I stop batch analysis before quitting Mixxx, there is no deadlock. Log:

Debug [Main]: in ~EngineMaster()
Debug [Main]: 740 ms deleting DlgPreferences
Debug [Main]: 835 ms deleting EffectsManager
Debug [Main]: GlobalTrackCache - Evicting track ["/home/be/music/yes/Yussef Kamaal - Black Focus/Yussef Kamaal - Black Focus - 07 Lowrider.flac" | "/home/be/music/yes/Yussef Kamaal - Black Focus/Yussef Kamaal - Black Focus - 07 Lowrider.flac" | 2604] Track(0x2e9cc370)
Debug [Main]: GlobalTrackCache - Skip to save an already evicted track Track(0x2e9cc370)
Debug [Main]: GlobalTrackCache - Deleting Track(0x2e9cc370)
Debug [Main]: GlobalTrackCache - Evicting track ["/home/be/music/yes/Yussef Kamaal - Black Focus/Yussef Kamaal - Black Focus - 09 WingTai Drums.flac" | "/home/be/music/yes/Yussef Kamaal - Black Focus/Yussef Kamaal - Black Focus - 09 WingTai Drums.flac" | 2601] Track(0x2e341980)
Debug [Main]: GlobalTrackCache - Skip to save an already evicted track Track(0x2e341980)
Debug [Main]: GlobalTrackCache - Deleting Track(0x2e341980)
Debug [Main]: 930 ms deleting SettingsManager
Debug [Main]: TrackAnalysisScheduler - Destroying
Debug [Main]: TrackAnalysisScheduler - Destroying
Debug [Main]: Mixxx shutdown complete with code 0
GlobalTrackCache - Deleting Track(0x2e724920)

Backtrace:

(gdb) thread apply all bt

Thread 8 (Thread 0x7f85d5749700 (LWP 5430)):
#0  0x00007f860065d73c in pthread_cond_wait@@GLIBC_2.3.2 ()
    at /usr/lib64/libpthread.so.0
#1  0x00007f85d626ab93 in  () at /usr/lib64/dri/i965_dri.so
#2  0x00007f85d626a8eb in  () at /usr/lib64/dri/i965_dri.so
#3  0x00007f860065758e in start_thread () at /usr/lib64/libpthread.so.0
#4  0x00007f85ffcb76a3 in clone () at /usr/lib64/libc.so.6

Thread 7 (Thread 0x7f85d77fe700 (LWP 5429)):
#0  0x00007f860065d73c in pthread_cond_wait@@GLIBC_2.3.2 ()
    at /usr/lib64/libpthread.so.0
#1  0x00007f8602294f5a in QTWTF::TCMalloc_PageHeap::scavengerThread() ()
    at /usr/lib64/libQt5Script.so.5
#2  0x00007f8602294fc5 in  () at /usr/lib64/libQt5Script.so.5
#3  0x00007f860065758e in start_thread () at /usr/lib64/libpthread.so.0
#4  0x00007f85ffcb76a3 in clone () at /usr/lib64/libc.so.6

Thread 6 (Thread 0x7f85d7fff700 (LWP 5427)):
#0  0x00007f85ffcac421 in poll () at /usr/lib64/libc.so.6
#1  0x00007f86001343a6 in  () at /usr/lib64/libglib-2.0.so.0
#2  0x00007f86001344d0 in g_main_context_iteration ()
    at /usr/lib64/libglib-2.0.so.0
#3  0x00007f86010df813 in QEventDispatcherGlib::processEvents(QFlags<QEventLoop::ProcessEventsFlag>) (this=0x7f85d0000b20, flags=...)
    at kernel/qeventdispatcher_glib.cpp:423
#4  0x00007f860108e17b in QEventLoop::exec(QFlags<QEventLoop::ProcessEventsFlag>) (this=this@entry=0x7f85d7ffebf0, flags=..., flags@entry=...)
    at ../../include/QtCore/../../src/corelib/global/qflags.h:140
#5  0x00007f8600ef6046 in QThread::exec()
    (this=this@entry=0x7f86013af060 <(anonymous namespace)::Q_QGS__q_manager::innerFunction()::holder>)
    at ../../include/QtCore/../../src/corelib/global/qflags.h:120
#6  0x00007f8601333f89 in QDBusConnectionManager::run()
    (this=0x7f86013af060 <(anonymous namespace)::Q_QGS__q_manager::innerFunction()::holder>) at qdbusconnection.cpp:178
#7  0x00007f8600eff4bb in QThreadPrivate::start(void*)
    (arg=0x7f86013af060 <(anonymous namespace)::Q_QGS__q_manager::innerFunction()::holder>) at thread/qthread_unix.cpp:367
#8  0x00007f860065758e in start_thread () at /usr/lib64/libpthread.so.0
#9  0x00007f85ffcb76a3 in clone () at /usr/lib64/libc.so.6

Thread 5 (Thread 0x7f85ecc19700 (LWP 5426)):
#0  0x00007f85ffcac421 in poll () at /usr/lib64/libc.so.6
#1  0x00007f86001343a6 in  () at /usr/lib64/libglib-2.0.so.0
#2  0x00007f8600134762 in g_main_loop_run () at /usr/lib64/libglib-2.0.so.0
#3  0x00007f85fd2eb10a in  () at /usr/lib64/libgio-2.0.so.0
#4  0x00007f860015d2aa in  () at /usr/lib64/libglib-2.0.so.0
#5  0x00007f860065758e in start_thread () at /usr/lib64/libpthread.so.0
#6  0x00007f85ffcb76a3 in clone () at /usr/lib64/libc.so.6

Thread 4 (Thread 0x7f85edc1d700 (LWP 5424)):
#0  0x00007f85ffcac421 in poll () at /usr/lib64/libc.so.6
#1  0x00007f86001343a6 in  () at /usr/lib64/libglib-2.0.so.0
#2  0x00007f86001344d0 in g_main_context_iteration ()
    at /usr/lib64/libglib-2.0.so.0
#3  0x00007f85ee436c7d in  () at /usr/lib64/gio/modules/libdconfsettings.so
#4  0x00007f860015d2aa in  () at /usr/lib64/libglib-2.0.so.0
#5  0x00007f860065758e in start_thread () at /usr/lib64/libpthread.so.0
#6  0x00007f85ffcb76a3 in clone () at /usr/lib64/libc.so.6

Thread 3 (Thread 0x7f85ee41e700 (LWP 5423)):
#0  0x00007f85ffcac421 in poll () at /usr/lib64/libc.so.6
#1  0x00007f86001343a6 in  () at /usr/lib64/libglib-2.0.so.0
#2  0x00007f86001344d0 in g_main_context_iteration ()
    at /usr/lib64/libglib-2.0.so.0
#3  0x00007f8600134521 in  () at /usr/lib64/libglib-2.0.so.0
#4  0x00007f860015d2aa in  () at /usr/lib64/libglib-2.0.so.0
#5  0x00007f860065758e in start_thread () at /usr/lib64/libpthread.so.0
#6  0x00007f85ffcb76a3 in clone () at /usr/lib64/libc.so.6

Thread 2 (Thread 0x7f85efa2e700 (LWP 5422)):
#0  0x00007f85ffcb1efd in syscall () at /usr/lib64/libc.so.6
#1  0x00007f8600ef2b85 in QtLinuxFutex::_q_futex(int*, int, int, unsigned long long, int*, int) (val3=0, addr2=0x0, val2=0, val=3, op=0, addr=0x16bb960)
    at thread/qfutex_p.h:105
#2  0x00007f8600ef2b85 in QtLinuxFutex::futexWait<QBasicAtomicPointer<QMutexData> >(QBasicAtomicPointer<QMutexData>&, QBasicAtomicPointer<QMutexData>::Type)
    (expectedValue=0x3, futex=...) at thread/qfutex_p.h:107
#3  0x00007f8600ef2b85 in lockInternal_helper<false>
    (timeout=-1, elapsedTimer=0x0, d_ptr=...) at thread/qmutex_linux.cpp:145
#4  0x00007f8600ef2b85 in QBasicMutex::lockInternal()
    (this=this@entry=0x16bb960) at thread/qmutex_linux.cpp:162
#5  0x00007f8600ef2be8 in QMutex::lock() (this=this@entry=0x16bb960)
    at thread/qmutex.cpp:229
#6  0x00007f86010928f5 in QCoreApplication::postEvent(QObject*, QEvent*, int)
    (receiver=0x1711050, event=event@entry=0x7f85e8016f00, priority=priority@entry=0) at kernel/qcoreapplication.cpp:1459
#7  0x00007f86010b7187 in queued_activate
    (locker=<synthetic pointer>..., argv=0x7f85efa2dc00, c=0x1710f30, signal=5, sender=0x17217a0) at kernel/qobject.cpp:3624
#8  0x00007f86010b7187 in QMetaObject::activate(QObject*, int, int, void**)
    (sender=0x17217a0, signalOffset=<optimized out>, local_signal_index=<optimized out>, argv=<optimized out>) at kernel/qobject.cpp:3723
#9  0x00007f85efc1fdf5 in QXcbEventReader::run() (this=0x17217a0)
    at qxcbconnection.cpp:1394
#10 0x00007f8600eff4bb in QThreadPrivate::start(void*) (arg=0x17217a0)
    at thread/qthread_unix.cpp:367
#11 0x00007f860065758e in start_thread () at /usr/lib64/libpthread.so.0
#12 0x00007f85ffcb76a3 in clone () at /usr/lib64/libc.so.6

Thread 1 (Thread 0x7f85fcd03980 (LWP 5421)):
#0  0x00007f85ffcb1efd in syscall () at /usr/lib64/libc.so.6
#1  0x00007f8600ef2b85 in QtLinuxFutex::_q_futex(int*, int, int, unsigned long long, int*, int) (val3=0, addr2=0x0, val2=0, val=3, op=0, addr=0x16bb960)
    at thread/qfutex_p.h:105
#2  0x00007f8600ef2b85 in QtLinuxFutex::futexWait<QBasicAtomicPointer<QMutexData> >(QBasicAtomicPointer<QMutexData>&, QBasicAtomicPointer<QMutexData>::Type)
    (expectedValue=0x3, futex=...) at thread/qfutex_p.h:107
#3  0x00007f8600ef2b85 in lockInternal_helper<false>
    (timeout=-1, elapsedTimer=0x0, d_ptr=...) at thread/qmutex_linux.cpp:145
#4  0x00007f8600ef2b85 in QBasicMutex::lockInternal()
    (this=this@entry=0x16bb960) at thread/qmutex_linux.cpp:162
#5  0x00007f8600ef2be8 in QMutex::lock() (this=this@entry=0x16bb960)
    at thread/qmutex.cpp:229
#6  0x00007f86010928f5 in QCoreApplication::postEvent(QObject*, QEvent*, int)
    (receiver=receiver@entry=0x2e724920, event=0x196d1c0, priority=priority@entry=0) at kernel/qcoreapplication.cpp:1459
#7  0x00007f86010b59f5 in QObject::deleteLater() (this=this@entry=0x2e724920)
    at kernel/qobject.cpp:2172
#8  0x0000000000600ce8 in (anonymous namespace)::deleteTrack(Track*)
    (plainPtr=0x2e724920) at src/track/globaltrackcache.cpp:77
#9  0x00000000006087ef in std::_Sp_counted_base<(__gnu_cxx::_Lock_policy)2>::_M_release() (this=0x2e35f6f0) at /usr/include/c++/8/bits/shared_ptr_base.h:148
#10 0x00000000006087ef in std::_Sp_counted_base<(__gnu_cxx::_Lock_policy)2>::_M_release() (this=0x2e35f6f0) at /usr/include/c++/8/bits/shared_ptr_base.h:148
#11 0x00000000006087ef in std::__shared_count<(__gnu_cxx::_Lock_policy)2>::~__shared_count() (this=<optimized out>, __in_chrg=<optimized out>)
    at /usr/include/c++/8/bits/shared_ptr_base.h:708
#12 0x00000000006087ef in std::__shared_ptr<GlobalTrackCacheEntry, (__gnu_cxx::_Lock_policy)2>::~__shared_ptr()
    (this=<optimized out>, __in_chrg=<optimized out>)
    at /usr/include/c++/8/bits/shared_ptr_base.h:1147
#13 0x00000000006087ef in std::shared_ptr<GlobalTrackCacheEntry>::~shared_ptr()
    (this=<optimized out>, __in_chrg=<optimized out>)
    at /usr/include/c++/8/bits/shared_ptr.h:103
#14 0x00000000006087ef in QtMetaTypePrivate::QMetaTypeFunctionHelper<std::shared_ptr<GlobalTrackCacheEntry>, true>::Destruct(void*) (t=<optimized out>)
    at /usr/include/qt5/QtCore/qmetatype.h:770
#15 0x00007f86010a2ffd in QMetaType::destruct(void*) const
    (data=0x7f84984c2b90, this=0x7ffefcaa73d0) at kernel/qmetatype.h:2221
#16 0x00007f86010a2ffd in QMetaType::destroy(int, void*)
    (type=<optimized out>, data=0x7f84984c2b90) at kernel/qmetatype.cpp:1811
#17 0x00007f86010b4ec9 in QMetaCallEvent::~QMetaCallEvent()
    (this=0x7f8498036a70, __in_chrg=<optimized out>) at kernel/qobject.cpp:485
#18 0x00007f86010b4f3d in QMetaCallEvent::~QMetaCallEvent()
    (this=0x7f8498036a70, __in_chrg=<optimized out>) at kernel/qobject.cpp:480
#19 0x00007f8601091aca in QCoreApplicationPrivate::cleanupThreadData()
    (this=this@entry=0x16fe2d0) at kernel/qcoreapplication.cpp:528
#20 0x00007f86014f98d7 in QGuiApplicationPrivate::~QGuiApplicationPrivate()
    (this=0x16fe2d0, __in_chrg=<optimized out>)
    at kernel/qguiapplication.cpp:1601
#21 0x00007f8601a4fb6d in QApplicationPrivate::~QApplicationPrivate()
    (this=0x16fe2d0, __in_chrg=<optimized out>) at kernel/qapplication.cpp:176
#22 0x00007f86010be66b in QScopedPointerDeleter<QObjectData>::cleanup(QObjectData*) (pointer=<optimized out>)
    at ../../include/QtCore/../../src/corelib/tools/qscopedpointer.h:52
#23 0x00007f86010be66b in QScopedPointer<QObjectData, QScopedPointerDeleter<QObjectData> >::~QScopedPointer() (this=0x7ffefcaa7728, __in_chrg=<optimized out>)
    at ../../include/QtCore/../../src/corelib/tools/qscopedpointer.h:107
#24 0x00007f86010be66b in QObject::~QObject()
    (this=<optimized out>, __in_chrg=<optimized out>) at kernel/qobject.cpp:884
#25 0x00007f860109169e in QCoreApplication::~QCoreApplication()
    (this=0x7ffefcaa7720, __in_chrg=<optimized out>)
    at ../../include/QtCore/../../src/corelib/tools/qstringlist.h:99
#26 0x00007f86014fbc0d in QGuiApplication::~QGuiApplication()
    (this=0x7ffefcaa7720, __in_chrg=<optimized out>)
    at kernel/qguiapplication.cpp:676
#27 0x00007f8601a51cc4 in QApplication::~QApplication()
    (this=0x7ffefcaa7720, __in_chrg=<optimized out>)
    at kernel/qapplication.cpp:857
#28 0x0000000000512ef8 in main(int, char**)
    (argc=<optimized out>, argv=<optimized out>) at src/main.cpp:142
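
Reading the backtrace: thread 1 is in QCoreApplicationPrivate::cleanupThreadData(), destroying a queued QMetaCallEvent; destroying the event's shared_ptr<GlobalTrackCacheEntry> argument ends up in deleteTrack() and QObject::deleteLater(), whose QCoreApplication::postEvent() call blocks on the event-queue mutex that appears to be held further up the same stack (thread 2 is blocked on the same mutex), so the main thread deadlocks against itself. The sketch below shows one generic way to avoid that pattern, assuming a custom deleter like the one in src/track/globaltrackcache.cpp; deleteObjectSafely is a hypothetical helper, not the fix that was actually committed.

// Hedged sketch: avoid posting a DeferredDelete event once the application is
// shutting down, because no event loop will ever process it and posting can
// block on the event-queue mutex (as in frame #6 of thread 1 above).
// deleteObjectSafely is a hypothetical helper, not the fix applied in this PR.
#include <QCoreApplication>
#include <QObject>

void deleteObjectSafely(QObject* plainPtr) {
    if (QCoreApplication::instance() && !QCoreApplication::closingDown()) {
        // Normal operation: defer deletion to the owning thread's event loop.
        plainPtr->deleteLater();
    } else {
        // The application is shutting down: delete synchronously instead of
        // posting an event that would never be delivered.
        delete plainPtr;
    }
}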

@uklotzde (Contributor, Author) commented:

The deadlock should be fixed and Mixxx should shut down cleanly even when executing an ad-hoc analysis (PlayerManager) or a batch analysis (AnalysisFeature).

The shutdown procedure is still a weak point in our code with all the runtime dependencies between various components.

Review comment (resolved, now outdated) on src/mixxx.cpp
@Be-ing (Contributor) commented Dec 28, 2018

I confirm that the deadlock on shutdown during batch analysis is fixed. I cannot reproduce the issue with the audio that @WaylonR reported. I'll let my computer finish analyzing ~4000 tracks, and if that goes fine, I think this is ready to merge.

The shutdown procedure is still a weak point in our code with all the runtime dependencies between various components.

#941 is a step in the right direction for cleaning that up.

@Be-ing (Contributor) commented Dec 29, 2018

After 7 hours of analysis, Mixxx crashed when the last track was shown at 95% completion:

Debug [AnalyzerThread 6 #11]: AnalyzerWaveform - ~AnalyzerWaveform():
Debug [AnalyzerThread 6 #11]: AnalyzerThread - Exiting worker thread
Debug [AnalyzerThread 6 #11]: DbConnection - Closing database connection: "MIXXX-13" QSqlDatabase(driver="QSQLITE", database="/home/be/mixxx-settings-test/mixxxdb.sqlite", host="localhost", port=-1, user="mixxx", open=true)
Debug [AnalyzerThread 6 #11]: AnalyzerThread 6 - Exiting
Debug [Main]: AnalyzerThread 7 - Stopping
Debug [Main]: AnalyzerThread 7 - Waking up
Debug [AnalyzerThread 7 #12]: AnalyzerWaveform - ~AnalyzerWaveform():
Debug [AnalyzerThread 7 #12]: AnalyzerThread - Exiting worker thread
Debug [AnalyzerThread 7 #12]: DbConnection - Closing database connection: "MIXXX-14" QSqlDatabase(driver="QSQLITE", database="/home/be/mixxx-settings-test/mixxxdb.sqlite", host="localhost", port=-1, user="mixxx", open=true)
Debug [AnalyzerThread 7 #12]: AnalyzerThread 7 - Exiting
Debug [Main]: GlobalTrackCache - Evicting track ["/home/be/music/yes/Yussef Kamaal - Black Focus/Yussef Kamaal - Black Focus - 10 Joint 17.flac" | "/home/be/music/yes/Yussef Kamaal - Black Focus/Yussef Kamaal - Black Focus - 10 Joint 17.flac" | 2603] Track(0x236cf0e0)
Debug [Main]: TrackDAO: Saving track 2603 "/home/be/music/yes/Yussef Kamaal - Black Focus/Yussef Kamaal - Black Focus - 10 Joint 17.flac"
Debug [Main]: TrackDAO: Updating track in database 2603 "/home/be/music/yes/Yussef Kamaal - Black Focus/Yussef Kamaal - Black Focus - 10 Joint 17.flac"
Debug [Main]: SqlTransaction - Started new SQL database transaction on "MIXXX-1"
Debug [Main]: SqlTransaction - Committed SQL database transaction on "MIXXX-1"
Debug [Main]: BaseTrackCache(0x3ff2220) updateIndexWithQuery took 0 ms
Debug [Main]: GlobalTrackCache - Deleting Track(0x236cf0e0)
terminate called after throwing an instance of 'std::out_of_range'
  what():  vector::_M_range_check: __n (which is 5) >= this->size() (which is 0)
fish: “mixxx --resourcePath ~/sw/mixxx…” terminated by signal SIGABRT (Abort)
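
For context on the last two lines of the log: std::vector::at() is bounds-checked and throws std::out_of_range; if the exception is never caught it reaches std::terminate(), which aborts the process, hence the SIGABRT. A tiny self-contained illustration:

// Tiny illustration of the failure mode in the log above: at() on an empty
// vector throws std::out_of_range; left uncaught, it ends in terminate()/SIGABRT.
#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
    std::vector<int> workers; // empty, like the cleared m_workers vector
    try {
        std::cout << workers.at(5) << '\n'; // __n (which is 5) >= this->size() (which is 0)
    } catch (const std::out_of_range& e) {
        std::cout << "caught: " << e.what() << '\n';
    }
    return 0;
}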

@Be-ing (Contributor) commented Dec 29, 2018

This crash is reproducible every time I analyze my whole library. I have 4046 tracks. Backtrace:

Thread 1 (Thread 0x7ffff2709e40 (LWP 30457)):
#0  0x00007ffff528553f in raise () at /lib64/libc.so.6
#1  0x00007ffff526f895 in abort () at /lib64/libc.so.6
#2  0x00007ffff5670e9b in  () at /lib64/libstdc++.so.6
#3  0x00007ffff56772fc in  () at /lib64/libstdc++.so.6
#4  0x00007ffff5677357 in  () at /lib64/libstdc++.so.6
#5  0x00007ffff56775b8 in  () at /lib64/libstdc++.so.6
#6  0x00007ffff5672e65 in  () at /lib64/libstdc++.so.6
#7  0x00000000009e2a26 in std::vector<TrackAnalysisScheduler::Worker, std::allocator<TrackAnalysisScheduler::Worker> >::_M_range_check(unsigned long) const (__n=<optimized out>, this=<optimized out>) at src/analyzer/trackanalysisscheduler.cpp:194
#8  0x00000000009e2a26 in std::vector<TrackAnalysisScheduler::Worker, std::allocator<TrackAnalysisScheduler::Worker> >::at(unsigned long) (__n=<optimized out>, this=<optimized out>) at /usr/include/c++/8/bits/stl_vector.h:981
#9  0x00000000009e2a26 in TrackAnalysisScheduler::onWorkerThreadProgress(int, AnalyzerThreadState, TrackId, double)
    (this=<optimized out>, threadId=<optimized out>, threadState=<optimized out>, trackId=..., analyzerProgress=<optimized out>)
    at src/analyzer/trackanalysisscheduler.cpp:178
#10 0x00007ffff6734392 in QObject::event(QEvent*) () at /home/be/local/lib/libQt5Core.so.5
#11 0x00007ffff71a7ab1 in QApplicationPrivate::notify_helper(QObject*, QEvent*) () at /home/be/local/lib/libQt5Widgets.so.5
#12 0x00007ffff71aebf0 in QApplication::notify(QObject*, QEvent*) () at /home/be/local/lib/libQt5Widgets.so.5
#13 0x00007ffff670b2c9 in QCoreApplication::notifyInternal2(QObject*, QEvent*) () at /home/be/local/lib/libQt5Core.so.5
#14 0x00007ffff670e177 in QCoreApplicationPrivate::sendPostedEvents(QObject*, int, QThreadData*) ()
    at /home/be/local/lib/libQt5Core.so.5
#15 0x00007ffff675d183 in  () at /home/be/local/lib/libQt5Core.so.5
#16 0x00007ffff57cd06d in g_main_context_dispatch () at /lib64/libglib-2.0.so.0
#17 0x00007ffff57cd438 in  () at /lib64/libglib-2.0.so.0
#18 0x00007ffff57cd4d0 in g_main_context_iteration () at /lib64/libglib-2.0.so.0
#19 0x00007ffff675c7f3 in QEventDispatcherGlib::processEvents(QFlags<QEventLoop::ProcessEventsFlag>) ()
    at /home/be/local/lib/libQt5Core.so.5
#20 0x00007ffff670a0db in QEventLoop::exec(QFlags<QEventLoop::ProcessEventsFlag>) () at /home/be/local/lib/libQt5Core.so.5
#21 0x00007ffff6711e3e in QCoreApplication::exec() () at /home/be/local/lib/libQt5Core.so.5
#22 0x0000000000515216 in (anonymous namespace)::runMixxx (args=..., app=0x7fffffffdfa0) at src/main.cpp:53
#23 0x0000000000515216 in main(int, char**) (argc=<optimized out>, argv=<optimized out>) at src/main.cpp:136

@Be-ing (Contributor) commented Dec 29, 2018

The crash occurs even if I only select a single track to analyze for batch analysis.

@Be-ing (Contributor) commented Dec 29, 2018

The TrackAnalysisScheduler::Pointer deleter calls TrackAnalysisScheduler::stop, which clears the m_workers vector as of dcf6301, before the last call to TrackAnalysisScheduler::onWorkerThreadProgress. I'll leave it to you to figure out the best way out of this.
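
A minimal sketch of that race, with simplified, hypothetical names (the real class is TrackAnalysisScheduler): a queued progress callback can still be delivered after stop() has cleared the worker vector, so an unchecked at(threadId) throws std::out_of_range. The sketch also shows one defensive option, ignoring stale callbacks, although the fix actually applied (see below) was to stop clearing the vector before the worker threads have terminated.

// Sketch only, hypothetical names: clearing the worker vector while worker
// threads may still report progress is the bug; an unchecked at(threadId)
// in the progress handler then throws std::out_of_range.
#include <cstddef>
#include <vector>

struct WorkerState { /* per-thread bookkeeping */ };

class SchedulerSketch {
  public:
    void stop() {
        m_workers.clear(); // premature: threads may still emit progress
    }
    void onWorkerThreadProgress(std::size_t threadId) {
        if (threadId >= m_workers.size()) {
            return; // defensive option: ignore a stale signal delivered after stop()
        }
        WorkerState& worker = m_workers[threadId];
        (void)worker; // ...update progress for this worker...
    }

  private:
    std::vector<WorkerState> m_workers;
};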

Side note: using function pointers for signal & slot connections makes it so much easier to find where signals & slots are used! KDevelop's "Show uses" feature now works as expected! :D
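
A small self-contained illustration of that connection style, using QTimer instead of any Mixxx class: the member-function-pointer form is checked at compile time and can be indexed by IDEs, while the string-based SIGNAL()/SLOT() form is resolved only at runtime.

// Illustration only (QTimer instead of Mixxx classes): the function-pointer
// connect syntax is compile-time checked and lets IDEs find all uses, whereas
// the commented-out SIGNAL()/SLOT() form is matched by string at runtime.
#include <QCoreApplication>
#include <QTimer>

int main(int argc, char** argv) {
    QCoreApplication app(argc, argv);
    QTimer timer;
    QObject::connect(&timer, &QTimer::timeout, &app, &QCoreApplication::quit);
    // Older equivalent: QObject::connect(&timer, SIGNAL(timeout()), &app, SLOT(quit()));
    timer.setSingleShot(true);
    timer.start(0);
    return app.exec();
}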

@uklotzde (Contributor, Author) commented:

Deleting the vector of workers before the worker threads have been terminated was simply wrong. I have reverted the change. This does not affect the shutdown behavior, i.e. there is still no deadlock.
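
A minimal sketch of the corrected ordering (hypothetical names, plain std::thread instead of the actual AnalyzerThread workers): join every worker thread first, and only then drop the per-worker bookkeeping, so no late progress report can see an empty vector.

// Sketch of the corrected ordering (hypothetical names): wait for every worker
// thread to exit before releasing the per-worker state.
#include <thread>
#include <vector>

struct WorkerSketch {
    std::thread thread;
};

void shutdownWorkers(std::vector<WorkerSketch>& workers) {
    for (WorkerSketch& w : workers) {
        if (w.thread.joinable()) {
            w.thread.join(); // 1. wait until the thread has really exited
        }
    }
    workers.clear();         // 2. safe now: no thread can report progress anymore
}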

@Be-ing (Contributor) commented Dec 29, 2018

Okay, quitting Mixxx while batch analysis is running and letting batch analysis finish both seem to work now. However, I encountered a bug which I am attempting to reproduce: after all tracks had been analyzed and I tried to reanalyze them all, the progress text got stuck at the last track (everything else continued to work). I have not been able to reproduce it since restarting Mixxx after the first occurrence. I suspect it may have been caused by my test database getting into a corrupted state from testing this branch with the previous bugs. I made a fresh database and am letting batch analysis run on my whole library again.

@uklotzde (Contributor, Author) commented:

@Be-ing I'm not able to get the progress display stuck near the end, regardless of how many tracks are analyzed.

The shutdown process is still a bumpy road, and I'm not able to reason about whether all threads finish in time to prevent a segfault. As long as we don't experience any segfaults on shutdown, I consider the current implementation safe enough. Hopefully we can improve this in the future, once all components follow a common life cycle and are orchestrated through an asynchronous startup/shutdown protocol.

@Be-ing (Contributor) commented Dec 30, 2018

I can't reproduce that issue with a fresh settings directory. Great work, thank you!

@Be-ing Be-ing merged commit b9c9b7f into mixxxdj:master Dec 30, 2018
@Gby56 commented Apr 13, 2019

Hi!
Is there any news about the multi-threaded analysis?
https://bugs.launchpad.net/mixxx/+bug/1641153 traces the history of the feature; it has apparently been superseded but still not merged, or has it?
Analysis seems quite slow and my CPU is at 100% on a single core, so it appears to be single-threaded for now.
Thanks :)

@uklotzde (Contributor, Author) commented:

Already available in master for testing, will be released in 2.3.0.
