More Than 100 Cable Types: A Look Back at Forensics Then and Now
When I began working in digital forensics, almost twenty years ago, there were more than 100 types of cable in use. The bulk of these were based on test fixtures, with the rest being proprietary cables from the manufacturers of the time. Each manufacturer seemed to have its own connection system, and some had different systems depending on which handset was being connected.
In more recent times, spurred on by developments in the Universal Serial Bus (USB) and by efforts to reduce electronic waste, there has been a push to standardize on a single connection system. This standardization still allows for proprietary communications protocols, but the number of cables needed for examining handsets in a digital forensic lab (and indeed for extraction in the field) has fallen over the years. In this document we will look at how cables have changed over the years and, with them, how communication with the devices we use and examine has changed too.
The old days
We begin our journey in pre-USB or very early USB times. Although USB was common on computing systems, it was not yet widely adopted on handsets, whether because it was too costly or because there was no real need while devices stored so little data. Most handsets used either plain serial connections or very rudimentary serial-over-USB connections. There was some intelligence in most cables (by this I mean that there was a CPU of some kind within the cable; it wasn't just wires), which meant there was a different driver for each cable. In those days, digital forensics was performed either over these proprietary cable systems or via a test point fixture (looking at you, old Nokia).
So, what do we mean by serial connections? A serial connection is a very simple way of connecting two devices. At its minimum, it consists of two wires: a transmitting wire and a receiving wire. This makes it both very easy and very cheap to implement. If both ends of the wires are configured with the same parameters, data can be transferred over the connection. These parameters include the number of bits in a single transfer, the number of bits that tell the system a transmission has started, and the number of bits that tell it the transmission is complete (serial connections of this kind pre-date modern computing and go back to telegraph equipment). Finally, there is the option of rudimentary error detection via a single parity bit.
The parity bit is optional, but the transmission speed and the start and stop bits are almost always required parameters. We can also include some extra wires for handshake control: Request To Send (signaling that the transmit buffer has data in it and we are ready to send it) and Clear To Send (signaling that the receive buffer is empty and we are ready to receive). These are not often used these days, since serial communications have become more stable (and virtual serial ports over USB are now far more common).
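To make these parameters concrete, the sketch below opens a serial port using the pyserial library and sets each of them explicitly. The port name and the specific values are placeholders for illustration; a real forensic cable would come with its own driver and documented settings.

```python
import serial  # pyserial: pip install pyserial

# Open a serial port with explicit framing parameters.
# "COM3" is a placeholder; substitute the port your cable enumerates as.
port = serial.Serial(
    port="COM3",
    baudrate=9600,                 # transmission speed in bits per second
    bytesize=serial.EIGHTBITS,     # data bits in a single transfer
    parity=serial.PARITY_NONE,     # the optional parity bit, disabled here
    stopbits=serial.STOPBITS_ONE,  # stop bit marking the end of a transfer
    rtscts=False,                  # RTS/CTS handshaking, rarely needed now
    timeout=1.0,                   # give up on reads after one second
)

port.write(b"\x55")                # send a byte; both ends must agree on framing
reply = port.read(16)              # read up to 16 bytes back
port.close()
```

If the two ends disagree on any of these settings, the result is typically garbage bytes rather than an outright error, which is part of why proprietary cables shipped with their own drivers that hid this configuration.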
Serial communications can be used in one direction at a time, which is called half duplex, or with both transmit and receive lines in use simultaneously, which is called full duplex. Most communications in this day and age are full duplex, but there was a time when half duplex was common (USB 2 uses half duplex communications where USB 3 uses full duplex, for example). Using these serial protocols usually meant using a proprietary cable, which then connected to the USB port of the computer (we could also use Bluetooth and IR serial protocols in those days, but USB has taken over). USB connections were not widely used on handsets at that time but were mostly found on PCs.
These cables had their own pin definitions but came with a USB serial converter built in, so some intelligence sat in the cable itself. It also meant that many cables had their own Windows driver, which, for a tool manufacturer, added real complexity to supporting many handsets.
The bulk of the cables in those days were based on Nokia test fixtures. Nokia was a major player in the mobile device market then, and probably half the handsets I examined were Nokias. These test fixtures connected to some kind of box that was primarily designed to service the handset. As time moved on, more devices were added, and we ended up with different boxes for different handsets, each with their own test fixtures. It is worth looking at one of these fixtures in more detail. In the first instance, the fixture replaced the battery, since the test point footprint usually sat under it.
To service a handset, it had to be placed into "Local Mode", which was achieved by setting a voltage on the battery temperature pin in the handset's battery connector. The fixtures used a resistor divider to set the correct voltage. They also matched the test point footprint on the board and connected the transmit and receive pins, so that serial communications with the device could take place.
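A resistor divider simply scales a supply voltage down by the ratio of two resistors. The sketch below shows the arithmetic; the supply and resistor values are illustrative placeholders, not the values the Nokia fixtures actually used.

```python
def divider_out(v_in: float, r_top: float, r_bottom: float) -> float:
    """Output voltage of a two-resistor divider: Vout = Vin * R2 / (R1 + R2)."""
    return v_in * r_bottom / (r_top + r_bottom)

# Illustrative values only: a 3.6 V supply divided down by 47k over 10k.
v_mode = divider_out(3.6, r_top=47_000, r_bottom=10_000)
print(f"Voltage presented on the mode pin: {v_mode:.2f} V")  # ~0.63 V
```

Because the handset only reads the voltage level on that pin, two fixed resistors are all the fixture needs to request the mode, with no active components required.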
Nokia service protocols depended on which chipset was in the device. Older chipsets used a protocol called MBUS, which was half duplex at 9,600 bits per second. Later devices used FBUS, which was full duplex at 115,200 bits per second. FBUS continued to be used on Nokia devices that gained USB connections, but it is no longer used in current devices, since it is no longer the same company that makes them.
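As an aside, FBUS framing has been documented by community projects such as gnokii, which describe frames that start with a 0x1E marker and end with a pair of XOR checksums over the even- and odd-indexed bytes. The sketch below assembles a frame along those lines; treat the field layout as an approximation taken from that community documentation rather than an official specification.

```python
def fbus_frame(dest: int, src: int, msg_type: int, payload: bytes) -> bytes:
    """Assemble an FBUS-style frame per community write-ups (e.g. gnokii).

    Approximate layout: 0x1E marker, destination, source, message type,
    16-bit payload length, payload padded to an even length, then two
    checksum bytes: XOR of the even-indexed and odd-indexed frame bytes.
    """
    length = len(payload)
    if length % 2:                       # frames are padded to an even length
        payload += b"\x00"
    frame = bytes([0x1E, dest, src, msg_type,
                   length >> 8, length & 0xFF]) + payload
    even = odd = 0
    for i, byte in enumerate(frame):
        if i % 2 == 0:
            even ^= byte
        else:
            odd ^= byte
    return frame + bytes([even, odd])

# A "get version" request as documented by gnokii (phone = 0x00, PC = 0x0C).
print(fbus_frame(0x00, 0x0C, 0xD1, b"\x00\x01\x00\x03\x00\x01\x60").hex())
```

The simple per-byte XOR is cheap enough for the small microcontrollers of the era while still catching most single-byte corruption on the line.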
The Coming of USB
USB started to become more popular in handsets around the same time smartphones did. This makes some kind of sense: USB 2.0 was more than an order of magnitude faster than USB 1.0, and smartphones tend to have an order of magnitude more non-volatile memory than older handsets. Some handsets, especially cheaper ones, continued to use serial ports and serial-to-USB converters for service modes and general communications with a PC. But most moved towards native USB, which offered better speed and reliability (and was probably also cheaper, since it didn't require a converter in the cable).
This made communications with a handset easier, and handsets now had several protocols they could use over USB. However, it also meant that the service modes we used for digital forensics work changed their access methods. Along with this, we saw the emergence of Lightning cables for Apple devices, which retained some intelligence, since the cable could negotiate a connection scheme.
Lightning cables themselves have been through a few iterations, but the Apple ID chip remained a part of the genuine cable until recently. This effectively locked the cable down and made it difficult for third parties to make cables for Apple devices, since they were always under threat of being made obsolete. On the other side, standard USB cables settled into mini- and micro-USB variants. This brought us down to being able to communicate with the bulk of devices using two or three cables (as a normal user).
Over time, the Nokia test fixture cables became obsolete as the handsets went out of production, so it made sense to save space and money and remove them from a general cable kit. As USB 2.0 matured, we saw the development of USB On-The-Go (OTG), which required an extra ID pin. Up until USB OTG, the device remained a device and had to connect to a host (mostly a PC), but with USB OTG, the device could switch to being a host itself.
The advantage was that a handset could be connected to a keyboard or another device, supporting many different input and output methods; this is what expanded the connector with that extra pin. But we still had a problem with getting to the special modes we used for forensic access to a device.
As it turned out, the extra ID pin could be used to identify a special cable that placed the handset in service mode. It could also be used to set up other modes, which some special-purpose dongles relied on (the Keep Alive Dongle uses the ID pin for its functioning, for example). Other special cables were designed for use with Qualcomm EDL mode, where one of the USB data lines is grounded for a short time at start-up to select the right mode.
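A handset that has entered Qualcomm EDL mode enumerates as the well-known "QDLoader 9008" USB device (vendor ID 0x05C6, product ID 0x9008), which gives us a simple way to check whether the special cable did its job. The snippet below is a minimal check using the pyusb library.

```python
import usb.core  # pyusb: pip install pyusb

# Qualcomm HS-USB QDLoader 9008: the IDs a handset presents in EDL mode.
QC_VID, EDL_PID = 0x05C6, 0x9008

dev = usb.core.find(idVendor=QC_VID, idProduct=EDL_PID)
if dev is not None:
    print("Handset enumerated in EDL mode.")
else:
    print("No EDL device found; check the cable and retry the timing.")
```

Because the grounding has to happen in a short window at start-up, a failed attempt usually just boots the phone normally, so a check like this is worth running before assuming the cable is faulty.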
Onwards to USB 3 and Type C
As time moves on, so does technology, and in this case the USB specification. USB 3.0 brought faster speeds, but also a different connector set, since SuperSpeed lines capable of high speeds (5 Gb/s for USB 3, versus 480 Mb/s for USB 2 and 12 Mb/s for USB 1) had to be incorporated. The lower speed USB 2 lines were also kept in case the cable or device couldn't handle USB 3 speeds (low quality cables are not designed for high speed and may fail negotiation with some devices).
The change to USB 3 also changed how we get to service modes. We see more test point access methods from this time, and we also see the use of different negotiations, since USB had introduced a method to change the function of a connection, and this was extended a little in USB 3. This isn't really a cable function but a function of the protocols that use the cable's communication lines. As cable speeds increase, we can place more data on the line in the same amount of time, so several protocols can share the same set of data lines at once.
At this point, we are a very long way from the serial lines of the original handsets. In more recent times there has been a push to standardize cables to reduce electronic waste, and this extends to chargers too. It has led to the development of USB Type-C. This connector system is reversible and has extra pins for configuration and sideband use.
The cable, together with the devices at each end, can negotiate which side of the connector to use, and the same cable can carry multiple end-user functions at once (since there are extra pins available for other communications, these can carry video or audio from a different device alongside data, for example). This reduces our cable count further, since Apple devices now use these cables as well as Android. In most cases they are assumed to be dumb cables, but special purpose cables can be made with intelligence back in the cable to perform these functions.
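One of those extra pins, the Configuration Channel (CC), is how orientation and power capability are detected: the source side places a pull-up resistor (Rp) on CC, the device side presents a pull-down, and whichever CC line sees the pull-up tells both ends which way round the plug is. The resistor values below are the standard Rp pull-ups to 5 V from the Type-C specification; the decoding function itself is just an illustrative sketch.

```python
# USB Type-C source-side pull-up (Rp) values when tied to 5 V, and the
# current level each one advertises to the attached device.
RP_ADVERTISEMENT_OHMS = {
    56_000: "Default USB power",
    22_000: "1.5 A at 5 V",
    10_000: "3.0 A at 5 V",
}

def advertised_current(rp_ohms: int) -> str:
    """Map a measured CC pull-up to the current the source is offering."""
    return RP_ADVERTISEMENT_OHMS.get(rp_ohms, "unknown / not a valid Rp")

print(advertised_current(22_000))  # -> "1.5 A at 5 V"
```

This is why a plain Type-C cable can stay "dumb": the resistors on the connected devices do the signaling, while active cables add their own electronics on top of the same pins.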
Summary
Cable kits for digital forensics have come a long way in the last twenty years. They have shrunk, since cables are more standardized, but where intelligence remains in the cables, a lot of development time is still spent making tools understand the protocols those cables carry.
The cable is also the most used part of the forensic system, and with that comes wear: every time we connect a cable, we degrade it by a very small amount (this is simple physics, as two contact surfaces are moved across each other).
So, if a dump doesn’t work, try to change the cable first. Soon, USB 4 will come out, which will provide us with a new set of challenges (USB 4 is meant to increase speeds to 40 to 80Gb/s depending on version) and a whole new set of protocols and device modes that we will need to understand.
Author
Dave Lauder
Software and Hardware Engineer at MSAB