feat: playback proxy #213
Conversation
poetry has a very conservative approach to locking versions. uv seems to prefer specifying dependencies quite widely in pyproject.toml, but then pinning exact versions in the lockfile.
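A hypothetical pyproject.toml fragment illustrating the difference (package name and versions are made up, not from this PR):

```toml
# poetry style: narrow caret constraint in pyproject itself
[tool.poetry.dependencies]
requests = "^2.31.0"

# uv / PEP 621 style: wide specifier here, exact pin lives in uv.lock
[project]
dependencies = [
    "requests>=2.28",
]
```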
Note that I've also bumped the Python version to 3.12 because I wanted inline generics (…).
Well, tentative impl... I'll actually test it when I get back. Bet it doesn't work :D. I'll stop force-pushing all the time once this goes out of draft.
Sorry, I haven't yet had time to look at the PR. I was wondering: are there any of these additions that would make better sense to add directly to tidalapi?
I don't like it, but it's too late to SHOUT at the clouds.
NW on the PR at all, it's still been in flight too. I've added cache eviction.

I don't think any of this belongs in tidalapi so far. Despite the number of commits here, I've modeled this as a pure HTTP(S) proxy sitting between Tidal and gstreamer. As such, it will actually work for any streaming service that uses HTTP. The logic was that there's really no reason tidalapi should handle caching, storing, or streaming: these are all things applications will have different ideas about and will need to solve differently. The only thing I do think we need in tidalapi is a way to opt out of MPEG-DASH when using PKCE (if that's even possible: it might just be that for PKCE you need the new encrypted download endpoints). It might make sense to spin off the whole of …

The remaining bit of architecture is to cache the ID -> path conversion somewhere, so playback can work via the cache without needing tidalapi. That will allow full offline mode for downloaded content. The obvious solution is just to store an entry in the cache DB when we kick off a download, and then use that to build the cache URL where we can without needing the internet. But I want to think this through, because "just" writing an HTTP proxy was really motivated by getting this up and running as easily as possible.

BTW we're now at the point where you can grab the progress bar and drag it around, both when proxying and when serving from the cache, and it will "just work". At least I can't make it crash any more. Current functionality:
The odd thing is that a "real" DB would make much of this easier (we could just use transactions for rollback...), but sqlite ships with Python and is available everywhere, so we'll use that. (I did consider not using a DB, but it ended up much easier this way.) This is a strict superset of the other two PRs, so it might make more sense to look at those first when you do get round to it. I'll try to get the uv tidalapi PR in in the next few days. Happy new year!
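The transactions-for-rollback point does work with stdlib sqlite3 too; a small sketch (table and values are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, path TEXT)")

# Using the connection as a context manager wraps the block in a
# transaction: it commits on success and rolls back on an exception.
try:
    with con:
        con.execute("INSERT INTO entries (path) VALUES (?)", ("/tmp/a",))
        raise RuntimeError("simulated failure mid-update")
except RuntimeError:
    pass

# The insert was rolled back, so the table is still empty.
count = con.execute("SELECT COUNT(*) FROM entries").fetchone()[0]
print(count)  # -> 0
```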
With this push everything is implemented except PKCE + offline download. I have a POC of the offline download, and it's going to be sufficiently complicated that I'll land it in a separate PR, either into this branch or upstream. For PKCE I'll wait for your input, but right now we could just document this as only working with OAuth and look into it later. Changes:
Naturally you might ask whether the cache shouldn't just work with URIs all along. I don't think that's a bad way to build it, but I've gone the other way for now :D. I did consider appending a custom HTTP query param like …
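For illustration, appending a custom query parameter to a URL is straightforward with stdlib urllib.parse (the parameter name and URL here are made up, not the ones the PR considered):

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def with_param(url: str, key: str, value: str) -> str:
    # Re-encode the query string with one extra key=value pair appended.
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query.append((key, value))
    return urlunparse(parts._replace(query=urlencode(query)))

print(with_param("http://localhost:8080/stream?id=1", "x-cache", "skip"))
# -> http://localhost:8080/stream?id=1&x-cache=skip
```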
Drat, we do need to add a new config option for the cache max size and pass it through.
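A sketch of what passing such an option through might look like (all names and the default are hypothetical, not the project's actual config):

```python
from dataclasses import dataclass

@dataclass
class ProxyConfig:
    # Hypothetical option: cache size cap in bytes; eviction
    # should run once the cache grows past this budget.
    cache_max_size: int = 1 * 1024**3  # 1 GiB default

def evict_needed(total_bytes: int, cfg: ProxyConfig) -> bool:
    # True when the cache is over budget and eviction should kick in.
    return total_bytes > cfg.cache_max_size

cfg = ProxyConfig(cache_max_size=1024)
print(evict_needed(2048, cfg))  # -> True
```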
This PR adds an HTTP proxy for playback, suitable for use over slow / unreliable connections. Unlike my previous proxy from years ago, this is a fully-featured (if minimal) HTTP proxy sitting between gstreamer and Tidal. It's also unfinished, but I'm opening it now for criticism / suggestions.
It is supposed to work as follows:
Note that caching is only applied when MPEG-DASH is not used, since caching an adaptive stream makes no sense (every chunk in MPEG-DASH is like a little file with one of multiple encodings, adapting to network conditions). We could make this work by just retrying until we get the high-quality section, but that really defeats MPEG-DASH. Naturally, this means the cache won't work with formats requiring MPEG-DASH. I don't think that's a problem, as you can't listen to ultrasonic-encoded audio over 4G anyway ;).
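A sketch of that cache gate (the function name and the flag are made up; `application/dash+xml` is the registered MIME type for a DASH manifest):

```python
# Hypothetical decision: bypass the cache for MPEG-DASH playback and
# cache only plain single-file HTTP streams.
DASH_TYPES = {"application/dash+xml"}  # MIME type of a DASH manifest (.mpd)

def should_cache(content_type: str, is_dash_session: bool) -> bool:
    if is_dash_session or content_type in DASH_TYPES:
        # Adaptive chunks vary with network conditions, so a cached
        # copy may be a low-quality rendition: don't keep it.
        return False
    return True

print(should_cache("audio/flac", False))          # -> True
print(should_cache("application/dash+xml", False))  # -> False
```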
NOTE: this PR is based off the uv PR, so most of the commits currently are from the uv / ruff port. If uv is rejected I'll rebase and get it working with whatever main is, although I hope it isn't, as I gave up trying to get poetry to play nicely with gstreamer on nix :D
The work is:
For the fun of it, this is a zero-dep, pure-Python impl.
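As an illustration of the zero-dependency approach, the stdlib alone can host such a proxy; a toy skeleton (the handler behaviour is invented, not the PR's code):

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class ProxyHandler(BaseHTTPRequestHandler):
    # Toy stand-in: a real handler would fetch the upstream URL with
    # urllib.request, stream it to the client, and tee it into the cache.
    def do_GET(self) -> None:
        body = b"proxied: " + self.path.encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Bind to an ephemeral localhost port and serve until interrupted.
    with ThreadingHTTPServer(("127.0.0.1", 0), ProxyHandler) as srv:
        print(f"listening on {srv.server_address}")
        srv.serve_forever()
```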