audio cuts out randomly while streaming from spotify #40
Same here, audio cuts out randomly. Using Volumio 1.5 on a Raspberry Pi 2 with a HiFiBerry DAC+.
Since there was no response on this, I made some free time and looked into the issue myself. I have solved it for my setup, so I will share my solution.

I installed monitoring tools on my Pi to watch the disk IO (to the microSD) and saw it spiking a lot when the audio cuts out. During regular MPD usage this does not happen (and there are no audio cut-outs either). So to verify my idea I mounted a CIFS share on the Pi; the share is served by my Windows HP MicroServer. Then I configured the cache path of the spopd daemon to use the share as its cache.

I hope this helps you out too. If you need any help doing this on Volumio, just ask me here and I'll help you out. I work as an IT administrator for Linux servers, so for me this is very basic stuff.
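For anyone who wants to try the same thing, a rough sketch of the idea is below; the share path, credentials and the /etc/spopd.conf location are placeholders/assumptions, not values taken from this thread.

```sh
# Watch disk IO to confirm the microSD is the bottleneck (iostat is in the sysstat package)
iostat -x 1

# Mount a read-write CIFS share to hold the spop cache
# (//fileserver/spop-cache and the credentials are placeholders)
sudo mkdir -p /mnt/spop-cache
sudo mount -t cifs //fileserver/spop-cache /mnt/spop-cache \
     -o username=youruser,password=yourpass,rw

# Then set cache_path under the [spop] section of the spop config, e.g.:
#   [spop]
#   cache_path = /mnt/spop-cache
# and restart spopd (see the ps / kill / -c steps further down the thread).
```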
Neat. My music is mounted over NFS anyway, so I'll try saving the cache there.
The built-in function in Volumio mounts the shares as ro (read-only); you can change this in the advanced settings of your library in the Volumio GUI. Just change ro to rw.
Where are you changing the cache settings? My changes to … don't seem to take effect. For others stuck in a similar situation, note that when you start spop over SSH it'll put its cache in …
Run `ps -ef | grep spopd` to see which config file spopd was started with. Edit that config file, then kill spopd and launch it again using that same `-c` option. Symlinking will still use your SD card, which will not fix it. I did all this under the root account.
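Spelled out, that sequence might look like the following; the config path is only an example, so use whatever path appears after `-c` in your own `ps` output.

```sh
# Find the running daemon and the config file it was launched with (the -c argument)
ps -ef | grep spopd

# After editing that config file, stop the daemon...
sudo kill "$(pgrep spopd)"

# ...and relaunch it with the same config file (path is an example)
sudo spopd -c /etc/spopd.conf
```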
Did you restart it using that config file as well? If not, check the spopd log for an error in your config file. I also only used …
Hi all,

First, well done. I wasn't able to reproduce this issue on my laptop (no RPi yet), so I could only suspect that it was a libspotify issue, which doesn't help much. Thanks a lot for your research @Du7chManiac, it explains a lot: slow IO causes libspotify to delay calls to the audio delivery callback, which causes audio buffer underruns.

@tobiasmcnulty, if spopd doesn't pick up your config changes, you can try quitting it from the JSON interface: just …

Meanwhile I'll try to reproduce this issue and add a workaround in spop. Thanks a lot for your messages @Du7chManiac and @tobiasmcnulty!
@Schnouki No problem for helping you out; basically I'm just helping someone who helped me out by delivering this daemon :)

In Volumio, spop is configured without a cache: cache_path is not in the config. Using it that way is what was causing my stuttering; it somehow does heavy IO even without a cache configured. But I'm not involved in developing Volumio, so I'm just stating what I see from looking at the config files. Volumio also creates a RAM filesystem, but it is very small, I think only 256 MB.

I think another solution would be to use the spop interface and set the playlists to offline mode. Maybe that would soften the IO load as well. I'm sticking with my current setup, though; I haven't had a single stutter since I configured the cache.

PS: On Volumio you control spop the same way, just telnet to localhost on port 6602.
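For illustration, a session over that interface could look roughly like this; the command names are assumptions based on spop's documented interface, so check the spop README for the exact set.

```sh
# Connect to spop's control interface on port 6602 (as mentioned above)
telnet localhost 6602

# Then type commands interactively, for example (names assumed, see spop's README):
#   status    -> report what is currently playing
#   bye       -> close the connection
```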
Update: skip this and read my next post below.

I had the same problems, using Volumio 1.5. I solved it by limiting the cache size to 20 MB and moving the cache to RAM. It's a no-cost solution and has worked fine for a few days now. In outline:

- First limit the cache size to 20 MB
- Turn off the Spotify client to avoid problems during the changes
- Check that /.cache is not used by any other program; the only thing listed should be "spop"
- Remove the old cache
- Create a symlink to the ramdisk instead
- Restart Volumio
keab's solution is working for me so far (along with preventing nginx from logging). Thanks!
This works for me too! Note that on my system (Volumio 1.55) the ramdisk is mounted at … Also, I think the last command should read … Thanks for the fix!
One more thing: I think the …
I confirm that this fix works perfectly. Thanks, keab. @tobiasmcnulty: correct.
Fix working here as well, thanks @keab.
Made an account to say thanks to everyone who found a solution. @keab, your solution also works fine here (RPi 1B with HiFiBerry).
Great that so many people were helped. However, I keep learning, so here's an update which simplifies things a lot: the symlink is not needed! Just point spop's cache_path directly at the ramdisk.

Regarding cache_size, I have done some experiments and spop seems to ignore the setting, even if I put it under [spop]. Leaving it out completely should limit the cache to 10% of available disk space (see https://github.com/Schnouki/spop/blob/master/spopd.conf.sample), but spop did not stop at 10%. So if anyone has found a way to limit the cache that actually works, please post it! I'm not sure what will happen if/when the ramdisk becomes full, but hopefully it will just delete the oldest files and continue.
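For reference, a rough sketch of this simplified setup follows. The ramdisk mount point (/run/shm), config path (/etc/spopd.conf) and the cache_size value/units are assumptions or examples, not values confirmed in this thread; check your own system first.

```sh
# See where a tmpfs/ramdisk is mounted on your system (e.g. /run/shm)
df -h | grep -E 'tmpfs|shm'

# Create a cache directory on it
sudo mkdir -p /run/shm/spop-cache

# In the spop config (e.g. /etc/spopd.conf), under the [spop] section, set:
#   cache_path = /run/shm/spop-cache
#   cache_size = 20   # assumed to be in MB; may be ignored, as noted above

# Restart spopd as described earlier (kill it and relaunch with -c)
```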
I'm running `spopd` on Volumio 1.5 on a Raspberry Pi, and for the most part it plays fine. However, several times per song (say 2-5 on average), the sound will go silent for a variable period of time (from less than a second to 2-3 seconds). The intervals between the moments of silence are also inconsistent (anywhere from 20-30 seconds to a couple of minutes). I've watched the system logs and `dmesg` for any events that might be happening around the same time, and `top` for any other processes that might be hogging the CPU, but I don't see anything. The moments of silence also don't seem to correspond to the log entries in `spopd.log` for downloading song parts. For example, audio will stop and start right in the middle of the time period between two "download complete" messages, i.e., sound doesn't start up again immediately upon logging of a "download complete".

To make things worse, the behavior is the same whether or not `high_bitrate` is set to `true`. The Raspberry Pi is not overloaded; spopd is only taking about 7-14% of the CPU, and no other process is taking near that amount. The load average on the machine is around 0.2-0.3.

Here's a sample log of the "blips" and how they correspond to the `spopd.log` entries: https://docs.google.com/spreadsheets/d/1PL8W1K1uCoqHYxMiykPq9z4FS8vfoIIvjX6z6mTTOJs/edit#gid=0