Friendly debate: What do we actually know about networking?

I’ll start by saying this is meant as a friendly debate… just to stir the pot without, hopefully, anyone being offended. All thoughts are welcome whether you’re an expert in the field or not; I’m absolutely NOT an expert and therefore apologize in advance for any ignorance.

As I read various forums about audiophile networking, there are all these posters who are IT professionals claiming it is impossible to have such sonic impacts unless there are significant signal dropouts, etc. I find it rather unsettling that folks are biased by their careers – 95%+ of all the “IT folks” (a large umbrella term for so many different roles/jobs/industries) that I know are experts in their specific application, but almost all of them either use commercially-available components/software, contract out, or build custom software or systems. I don’t know any IT professionals who have actually done R&D on, or performed quantitative chip-level design of, their own switches, networking circuit boards, transponders/receivers, or cables. And even then, it’s only on software systems or specific to their application, not on actual audio streaming.

My opinion on the above is that the IT professional’s career, while probably highly successful in their field, is not directly related to audio streaming, yet they assume it’s directly translatable. That’s no different than an Airbus/Boeing product manager of 30 years claiming expertise on a NASA/ESA Mars lander without actual direct experience. I get that folks can have opinions and best guesses. But to claim expertise and ground truth is a different matter.

Looking forward to your thoughts.

/flame suit on/

3 Likes

Here’s an article for funsies:

Similarly, I wish I could record the actual raw digital stream (voltage trace) coming out of my streamer’s SPDIF cable and capture what my DAC sees while playing a 1-minute Qobuz clip. Then, change the Ethernet cable / switch / PSU / router etc. upstream of my streamer, re-capture the data, and subtract the difference. If the time-aligned signals coming out of my streamer are identical (or different), isn’t this the definitive test?
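A rough sketch of that subtraction test in Python, assuming the two voltage traces have already been captured (e.g. from a scope) at the same sample rate and saved as NumPy arrays; the file names are hypothetical:

```python
# Sketch of the "capture, swap a component, re-capture, subtract" test.
# before.npy / after.npy are hypothetical S/PDIF voltage-trace captures.
import numpy as np
from scipy.signal import correlate

before = np.load("before.npy")
after = np.load("after.npy")

# Time-align the two captures via FFT-based cross-correlation.
corr = correlate(after, before, mode="full", method="fft")
lag = int(corr.argmax()) - (len(before) - 1)  # after[n] ~ before[n - lag]

# Overlap the aligned regions and subtract.
if lag >= 0:
    a, b = after[lag:], before
else:
    a, b = after, before[-lag:]
m = min(len(a), len(b))
diff = a[:m] - b[:m]

print(f"lag: {lag} samples, max |difference|: {np.abs(diff).max():.6g}")
# If the residual sits at the capture noise floor, the DAC saw the same signal.
```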

3 Likes

That it’s a bunch of black magic that shouldn’t matter, but does. :joy:

3 Likes

This is a good topic. Like you, I am not an expert in this field, and while anecdotal information is shared that focusing on this has a positive outcome, I remain skeptical, as I am unable to wrap my head around the idea that networking upgrades WILL translate into an improvement.

There is so much interference in the house these days, be it from other signal-emitting sources or something as basic as a bad electrical grid. How does one measure the improvement in a quantitative rather than qualitative manner?

To be able to do that, you would need an impossible amount of knowledge. Just to name the fields involved in that endeavour:

  • Physics
  • Chemistry
  • Electrical Engineering
  • All the highly specialised manufacturing fields that I am not even aware of

When Device-A says it complies with RFCs and IEEE, ISO, DIN, RoHS, CE, etc. standards, then it is perfectly reasonable to assume that hundreds of highly skilled people, experts in their respective niche fields, have done their job to satisfy those standards and regulations.


To put this bluntly:
When you buy a thing with an RJ45 jack that says “Network” next to it, do you pull out the multi-thousand-dollar oscilloscope to verify that the rise and fall times satisfy those specified in IEEE 802.3ab, or do you just plug the thing in and see if the lights you expect to be green and amber light up?

5 Likes

100% agree. Everything’s an onion, the layers are endless.

Regarding your hypothetical question about the RJ45 jack and IEEE standards: if I were using it for networking, then no, I’d assume part-to-part variation, but that the engineering + software correction is so robust that I don’t worry whether it’s a $0.01 part from Baidu or a $40 Telegartner plug.

But if I claimed to be a subject matter expert and asserted that it can’t possibly affect an audio signal, then I should do due diligence, and that’s when I’d consider doing measurements. I’d look at the variations in metallurgy and wire manufacturing, the workflow of the crimping, how the pins are extruded, how the plastic is molded, the sourcing and variations in all of that, and figure out whether I need to come up with a better test simulation of the real audio signal. Because I had to do that before on a failure analysis of a device that killed a patient, and it took 3 years and $40M (in that case we were required to file with the FDA and convince them why the design change mitigated the risk). And even then, I’d say I’m still not an expert…

1 Like

My problem with the whole “debate” is that even a network guy isn’t qualified to respond based on their years of experience, because a network relies heavily on robust protocols with error correction, but it’s built for DATA, and the time domain isn’t as critical when transferring said data.

My issue with all of the “bits are bits” rebuttals is that even if they acknowledge sonic differences, they then go on to say that it’s a cable not in spec, and they simply won’t accept a sonic change (good or bad) as possible. The net goal-post movement is that you can’t A/B something correctly 100% of the time and your memory and hearing are imperfect.

I’ve worked in data networking going back to analogue voice networks and have experience in LAN/WAN and various over-the-air TCP/IP traffic going back to 2.5G networks. So I’m by no means an expert, but I do know what I’m talking about on the network side as well.

I changed a fiber optic cable a couple of months ago and heard a difference. Why bother even trying to explain that to someone who will either not accept that I hear what I hear, or will move the goal posts were I to provide some information on why the time domain and processing and EMI and RFI matter?

4 Likes

The problem is that a lot of audiophiles assume a mechanism for the effect that audiophile network equipment has (the same is true of d2d boxes, for what it’s worth), and those mechanisms literally cannot have the stated effect, at least not in a simple, direct fashion.

Unless your home network is broken, you will never see a lost packet between machines on the network. I know this because I’ve run hundreds of tests for network games using various mechanisms, and without a simulator to introduce bad/dropped packets, on a LAN you just never see lost packets.
So the whole “it’s lossy transmission” thing is out, even without getting into TCP/IP having error correction and loss recovery.
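If you want to check your own LAN, a quick sketch that parses ping’s summary line (the host address is a placeholder for any machine on your network; the -c flag is the Linux/macOS count option):

```python
# Quick-and-dirty LAN packet-loss check by parsing ping's summary line.
import subprocess

HOST = "192.168.1.1"  # hypothetical address of a machine on your LAN
COUNT = 100           # number of ICMP echo requests to send

result = subprocess.run(
    ["ping", "-c", str(COUNT), HOST],  # -c = count on Linux/macOS
    capture_output=True, text=True,
)
# ping ends with a line like "100 packets transmitted, 100 received, 0% packet loss"
for line in result.stdout.splitlines():
    if "packet loss" in line:
        print(line.strip())
```

On a healthy wired LAN this reliably prints 0% loss, which is the point.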

It can’t be related to “clocking”, because the protocols are packet-based and there is no audio clock in the signal; you’re literally just doing the same thing as when you download a file, or copy one. So the concept of a “re-clocker” is triggering to people who understand how the transmission works. It does not mean that such a device cannot have an effect, it’s just a bad name for what it’s doing; see the next bit.

So what does that leave? The simple answer is noise on the transmission lines, making it from there into a device (streamer), at which point that noise can affect jitter directly (ignoring USB for a minute).

It’s why optical isolation is the “panacea”: electrical noise can’t travel over an optical line, so nothing before the receiver is affecting anything. You’ll still introduce some at the receiver, but that should be limited, though I’d guess, given the heat those things generate, probably more than a simple 100BASE-T network receiver.

Now having said that, I could imagine there are other effects: the quality of the transmitted optical signal MIGHT impact how much noise the receiver generates (though I’m doubtful of this). For most of us the network equipment is sharing a connection to the power line being used by the rest of the equipment; again, I doubt this is going to make a significant difference. If you do have a network problem and packets are retransmitted, then I could see that impacting noise production by the receiver. Same for how packets are delivered, which has as much to do with the streamer and the protocol as it does the network itself.

But none of that explains claims about, for example, network cables. I personally have not noticed a big difference, but if such differences do exist, they are likely associated with the cables themselves picking up RF noise and adding it into the mix.

8 Likes

Many good things have already been brought up here, and Alpha Audio does a pretty good job on it (this guy is a well-known engineer). Not like Amir from ASR, looking at a single electronic measurement and claiming all others have no clue.

I’m a real IT guy from the ’90s, have worked for a large telco and ISP, and have done a lot in networking design :wink: I even installed the first ATM core backbone in DE… okay, enough.

So I am for sure convinced that the audiophile networking stack must be looked at from a different angle! The OSI layers and TCP/IP will for sure not make any difference!!

Here come my but(s):

  • a decent network switch is sufficient for 99% of audiophile streaming cases
  • yes, I’m also using an OCXO-optimized and multi-stacked version from Paul Pang
  • does it make an audible difference? NO (okay, I don’t use crappy WLAN in between)
  • EVERYTHING (magic) happens in the streaming transport protocol, DAC module + clocks + digital filters, analogue output stage AND operating system in use!!

I’ve seen that now in my Ian Canada DIY high-end Streamer+DAC+OPA build.
Tested and tuned several audiophile OSes and ended up with gentooplayer + an optimized kernel for the DAC/FifoPi Q7, and put it in RAM… it blows away most commercial streamers (mostly way overloaded and barely optimized… even Volumio is badly implemented).

Don’t forget, in the background, 98% of the software in audio streamers today is based on well-known open source, NOT any fancy audio application development (besides the fancy UI!!).

My personal conclusion: the software side is the most critical in streaming, and of course the DAC + master clock (huge impact) and analogue output stage!

7 Likes

Different topic, but I entirely agree: almost no manufacturer writes the software in the streamer; they almost all use a pared-down Linux distribution with open-source software to deal with the streaming.

Linux is FAR from a good OS for a streamer, but it just isn’t practical for most manufacturers to try and support multiple protocols on an RTOS.

Vincent at totalDAC will tell you the Roon-only build of the streamer software sounds better (and it might) than the one with support for all the other formats.

The Taiko, interestingly, is Windows-based, not Linux-based.

A Raspberry Pi is total overkill for streaming audio if the software were dedicated to the task.
It’s been on my list of things to do to build the simplest streamer I can using the lowest-powered microcontroller that will work. I just can’t find a good USB stack that supports audio, to simplify driving a DAC.

6 Likes

Because sharing knowledge is enlightenment? I admit I don’t quite get the time domain aspect, especially if that is what the DAC or streamer is handling. EMI and RFI are sneaky sunnabitches! They find ways around the road blocks!

2 Likes

It’s not the time domain on Ethernet, it’s the time domain on the components that have to work to turn that Ethernet signal into USB, AES or SPDIF. This is the best educated guess I can offer based on my experience in listening and what I know about networking. Networking works, but for audio, the less work a streamer and/or a DAC or any processor in the chain has to do, the better, because that work causes noise. Whether it’s USB cleaning up a crap signal, or whatever portion of the streamer takes the network information stream and converts it to audio or I2S.

Again, only a best guess on my part, because optical doesn’t carry network noise, yet it can be improved. A better Toslink cable can improve the sound, and that’s supposed to be completely noise-isolated. It’s just difficult to explain something when I don’t have the umpteen required degrees to understand the physics, the electronic engineering, the various protocols, and whatever else.

I listen, I know it makes a difference (whether for better or worse), and I’m good with that. And we lived happily ever after.

4 Likes

But that doesn’t hold water either, because the streamer buffers the incoming stream. Using UPnP/DLNA, one of my Pi streamers is hardwired and will consume an entire song’s worth of audio in <10 seconds; it’s just playing from memory after that, no Ethernet involved.
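You can see that buffering behaviour for yourself with a rough timing sketch; the URL and track length below are hypothetical placeholders for a file on your own media server:

```python
# Sketch: time a LAN transfer of a whole track and compare to its playback length.
import time
import urllib.request

TRACK_URL = "http://192.168.1.50/music/track.flac"  # hypothetical LAN media server
PLAYBACK_SECONDS = 240                              # assumed ~4-minute track

start = time.monotonic()
with urllib.request.urlopen(TRACK_URL) as resp:
    data = resp.read()  # pull the entire file into memory, like a buffering streamer
elapsed = time.monotonic() - start

print(f"{len(data) / 1e6:.1f} MB transferred in {elapsed:.1f} s")
print(f"Playback would take {PLAYBACK_SECONDS} s; the network was done "
      f"{PLAYBACK_SECONDS - elapsed:.0f} s early")
```

Once the transfer finishes, everything downstream is reading from RAM, not the cable.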
I would never tell anyone what they hear, and I have no idea how an optical Ethernet cable could affect sound, but the time domain of the signal on the Ethernet cable just doesn’t make sense. There has to be some other effect occurring. If you changed from single-mode to multimode, I could easily believe it.
Similarly, if the cable is actually resulting in corrected errors, I could see it, but IME they just don’t happen unless you have a bad cable.

Yes, but that’s different: the Toslink cable carries the clock, and there will be jitter introduced during conversion to and from optical. In the case of Ethernet, the clock is injected by the streamer.

3 Likes

I get that you hear it, ergo it does something positive. I am trying to get at the why, as in: why does one optical network system work better than another?
For example, why not use the lower-cost SFP boxes instead of the pricey model sold by Uptone? These pieces, or rather the components that create the environment for noise to be eliminated, don’t seem to be discussed enough to explain why one should fork over their hard-earned cash for the next upgrade. Hope that helps provide a background on where I’m coming from, as there are those of us with no IT background at all.

I just get out and meet people. Have friends introduce you to others who can provide opportunities, and reciprocate when possible. Networking isn’t that hard.

11 Likes

:point_up::point_up::point_up:
:rofl::joy::rofl:

1 Like

As someone who worked as a network professional for several years and maintained a CCNP certification for that duration, I completely dismiss the notion that anything you do to a home network can make any difference, much less any difference that is audible. (That’s assuming you’re not having catastrophic packet loss in your WiFi or Ethernet network due to some faulty component.) Network protocols are so incredibly robust, and provide orders of magnitude more bandwidth than audio streaming demands, that it’s the absolute last place anyone should spend any part of an audiophile budget.
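A back-of-the-envelope calculation makes that bandwidth headroom concrete (raw uncompressed PCM rates against nominal Ethernet link speeds):

```python
# Uncompressed PCM bitrates vs. common Ethernet link speeds.
def pcm_bitrate(sample_rate_hz: int, bit_depth: int, channels: int) -> int:
    """Raw PCM bitrate in bits per second."""
    return sample_rate_hz * bit_depth * channels

streams = {
    "CD (44.1 kHz / 16-bit)": pcm_bitrate(44_100, 16, 2),      # ~1.41 Mbps
    "Hi-res (192 kHz / 24-bit)": pcm_bitrate(192_000, 24, 2),  # ~9.2 Mbps
}

for name, bps in streams.items():
    print(f"{name}: {bps / 1e6:.2f} Mbps -> "
          f"{100e6 / bps:.0f}x headroom on 100 Mbps, "
          f"{1e9 / bps:.0f}x on gigabit")
```

Even 192/24 stereo uses under 1% of a gigabit link, and that’s before any compression.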

Spend the money on room treatments instead. If you have money left over, spend more of it on room treatments, anyway.

3 Likes

Thanks for posting. I have seen many experienced audiophiles share the same perspective as you, so if it’s okay, I’d like to ask why, then, some other experienced audiophiles do feel like “everything matters”, including the routers & switches & their power supplies, Ethernet cables, etc. Is it that:

  1. There’s actually no possible difference, and the perceived differences are all internal functions of the brain? If so, that’s a LOT of highly educated and perfectionist folks who are imagining improvements…
  2. There’s actually a difference which is perceptible to listeners, but we are limited by our measurement techniques and haven’t found it?

Sounds like there’s a market for a blind A/B box that allows psychoacoustic tracking of preference for various cables/signals…

:slight_smile:

2 Likes

And the issue there is that blind A/B back-to-back is by a long way not the best way to hear these nuances, because what we’re hearing is often spatial cues and dimensionality and dynamics. It’s not like gains made in improving Ethernet are anywhere near the top of the areas for improvement, unless you’re injecting unintended noise somewhere, but there are gains to be had.

But it also doesn’t mean that they’re not there. Where they come from is the only thing that’s unknown to me, but ask anyone who goes from a Sonore ultraRendu to an opticalRendu and they’ll tell you of the gains they hear, and it’s not subtle. :slight_smile:

2 Likes

They are also exaggerated in this community :slight_smile:

5 Likes