Clarification Needed: Deepgram Self-Hosted Transcriptions Functioning Without Local Models #995

@devalbham, that's an interesting observation. Did you successfully serve both batch and streaming requests before deleting those models, so you know they were actually loaded into memory at the time they were deleted?
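If it helps, here's roughly how I'd exercise the batch path first and capture the request ID for later lookup. The port, model name, and the `metadata.request_id` field are assumptions based on a default self-hosted setup; adjust them to match your deployment.

```python
# Sketch: exercising the self-hosted API so the model is loaded in memory,
# and capturing the request ID for the usage logs.
# Assumes the self-hosted API container is reachable at localhost:8080 and
# that the model name below matches one you have mounted.
import requests

API_URL = "http://localhost:8080/v1/listen"  # default self-hosted API port (assumption)

def batch_request(audio_path: str, model: str = "nova-2") -> str:
    """POST a local audio file for pre-recorded (batch) transcription and
    return the request ID so it can be looked up in the Console usage logs."""
    with open(audio_path, "rb") as f:
        resp = requests.post(
            API_URL,
            params={"model": model},
            headers={"Content-Type": "audio/wav"},
            data=f,
            timeout=300,
        )
    resp.raise_for_status()
    return resp.json()["metadata"]["request_id"]

if __name__ == "__main__":
    print("batch request_id:", batch_request("sample.wav"))
```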

Setting aside the exact mechanism, it sounds like the concern for your clients is that Deepgram is somehow falling back to another cloud transcription endpoint. Is that right? I'll offer a few ways you can establish confidence that this isn't the case.

First, in the Deepgram Console, you can go to Usage -> Logs, enter your request ID, and verify that the request still shows Deployment: Self-Hosted.
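If you'd rather check programmatically than through the Console UI, something like this sketch against the usage API could work. The project ID, key placeholder, and the exact shape of the response (including where the deployment is reported) are assumptions; the Console remains the authoritative view.

```python
# Sketch: looking up a request via Deepgram's usage API instead of the Console UI.
# Assumes an API key with usage read access on the project; inspect the raw JSON
# for the deployment information, since field names here are not guaranteed.
import json
import requests

PROJECT_ID = "your-project-id"     # hypothetical placeholder
REQUEST_ID = "your-request-id"     # from the transcription response metadata
API_KEY = "your-deepgram-api-key"  # hypothetical placeholder

resp = requests.get(
    f"https://api.deepgram.com/v1/projects/{PROJECT_ID}/requests/{REQUEST_ID}",
    headers={"Authorization": f"Token {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))  # confirm the request is attributed to self-hosted
```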

Second, the only outbound access the containers need is to license.deepgram.com,…
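As a rough way to check that in practice, you could snapshot established outbound connections on the host while requests are being served and confirm the only external destination is license.deepgram.com. The sketch below assumes psutil is installed, that container traffic is visible from where you run it, and uses only a crude private-range filter; adapt it to your network setup.

```python
# Sketch: listing established outbound TCP connections and flagging any
# external destination that is not license.deepgram.com.
import socket
import psutil

license_ips = set(socket.gethostbyname_ex("license.deepgram.com")[2])

for conn in psutil.net_connections(kind="tcp"):
    if conn.status != psutil.CONN_ESTABLISHED or not conn.raddr:
        continue
    remote_ip = conn.raddr.ip
    # Rough filter for loopback/private traffic between your own services.
    if remote_ip.startswith(("127.", "10.", "192.168.", "172.")):
        continue
    label = "license.deepgram.com" if remote_ip in license_ips else "UNEXPECTED"
    print(f"{remote_ip}:{conn.raddr.port} -> {label}")
```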

Answer selected by devalbham