This week has definitely been interesting in that I have been able to see and better understand the process of how we view websites. I spent a lot of my time experimenting with Wireshark and trying to better understand what all of the information means. The first thing that immediately stood out to me was just how much information was being processed at the same time. To bring it all down to a manageable level, I had to disconnect most of my devices from the network and use one browser tab at a time.

Another thing I thought was interesting was how information is broken down into packets and sent along whichever route is best for each one. Looking into this a little more, I found that the main purpose of this is to make the transfer of data not only more efficient but also more resilient. For example, if data were sent over a single direct connection from the host to the client, anything that happened to that connection would make it much slower, if not impossible, for the user to get the information; with packets, traffic can simply be routed around the problem.
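To get a feel for this, I put together a toy Python sketch (not real networking code, just an illustration I came up with): a message is split into numbered packets, the packets are shuffled as if they had taken different routes, and the receiver reassembles them by sequence number.

```python
import random

def split_into_packets(message, size=4):
    """Break a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Sort by sequence number and join the chunks back together."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = split_into_packets("Hello, packet-switched world!")
random.shuffle(packets)     # simulate packets arriving out of order
print(reassemble(packets))  # the original message comes back intact
```

Because each chunk carries its own sequence number, it doesn't matter what order the packets arrive in, which is exactly what lets real networks send them over different routes.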

While I was researching the topic of packets, another thing that came up was the issue of packet loss. This is when packets are disrupted or dropped before they reach their destination, causing the information either to be read incorrectly or not to arrive at all. This is a major cause of things like lag when loading a web page, or of pages only partially loading. I found the number of possible causes pretty amazing, ranging from simple things like walls between you and your Wi-Fi router to genuine hardware and network issues that can be extremely hard to pinpoint and solve.
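I also tried sketching why packet loss shows up as lag rather than outright failure. In this toy Python simulation (purely illustrative, with a made-up loss rate), each packet is dropped with some probability and the "sender" keeps retransmitting until everything has arrived; the more packets get lost, the more attempts it takes, which is the delay a user would feel.

```python
import random

def send_with_retries(packets, loss_rate=0.3, seed=42):
    """Keep resending packets until all arrive; count total attempts."""
    rng = random.Random(seed)
    received = {}
    attempts = 0
    while len(received) < len(packets):
        for seq, chunk in packets:
            if seq in received:
                continue  # already delivered, no need to resend
            attempts += 1
            if rng.random() >= loss_rate:  # packet survived the trip
                received[seq] = chunk
    message = "".join(received[seq] for seq in sorted(received))
    return message, attempts

chunks = ["la", "g ", "is", " p", "ac", "ke", "t ", "lo", "ss"]
message, attempts = send_with_retries(list(enumerate(chunks)))
print(message)   # the full message eventually gets through
print(attempts)  # but it took more sends than there are packets
```

With a 30% loss rate the message always arrives eventually, it just takes extra round trips, which matches the "slow but not broken" behavior I was reading about.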