AIS is a complex, finely balanced technology. Each AIS product must be interoperable within the wider AIS system, and to achieve this it must meet the criteria laid down in the IMO performance standards and the strict international technical standards maintained by the IEC. Adherence to these standards guarantees the integrity and operational performance of the AIS system and interoperability between AIS devices.
AIS uses TDMA (Time Division Multiple Access) to allocate and share the available airtime on the AIS frequencies. The AIS standard defines a fixed number of time slots on each of the two AIS channels: 2,250 per channel every 60 seconds, giving a total of 4,500 slots every 60 seconds across both channels.
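The slot arithmetic above can be worked through directly. This is a minimal illustration; the slot count and channel count are those given in the text, and the 9.6 kbit/s GMSK bit rate is described later in this section:

```python
SLOTS_PER_MINUTE = 2250   # time slots per AIS channel every 60 seconds
CHANNELS = 2              # AIS1 and AIS2
BIT_RATE = 9600           # GMSK bit rate in bit/s

slot_duration_ms = 60_000 / SLOTS_PER_MINUTE        # length of one slot
bits_per_slot = BIT_RATE * 60 // SLOTS_PER_MINUTE   # capacity of one slot
total_slots = SLOTS_PER_MINUTE * CHANNELS           # slots per minute overall

print(slot_duration_ms, bits_per_slot, total_slots)
```

Each slot is roughly 26.7 ms long and carries 256 bits, which is why the size of a single AIS transmission, and hence the amount of information one message carries, is limited.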
Each time slot provides a fixed amount of space for an AIS transmission, which in turn limits the amount of information a single AIS message can carry. The AIS system has been designed to support local, regional and national level systems, and it intelligently adapts itself to the number of vessels within a given area. If the 'slot map' ever becomes overloaded, priority is given to the closest vessels, ensuring they are seen first. In practice this means that, no matter how many vessels are being tracked, a properly configured AIS system ensures that all nearby vessels are seen.
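The prioritisation described above can be illustrated with a simplified sketch. This is not the actual slot-selection algorithm from the standard, only an illustration of the principle: a station that finds the slot map full reuses the slot held by the most distant vessel so that nearby traffic remains visible.

```python
def pick_slot_to_reuse(slot_map):
    """Given a full slot map (slot number -> distance in nautical miles
    of the vessel occupying it), reuse the slot of the farthest vessel
    so that the closest vessels keep their slots and remain visible."""
    return max(slot_map, key=slot_map.get)

full_map = {0: 3.2, 1: 45.0, 2: 12.5}   # hypothetical occupied slots
print(pick_slot_to_reuse(full_map))     # the slot held by the 45 nm vessel
```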
An AIS transponder normally works in an autonomous and continuous mode, regardless of whether it is operating in the open seas or coastal or inland areas. AIS transponders use two different frequencies, VHF maritime channels 87B (161.975 MHz) and 88B (162.025 MHz), and use 9.6 kbit/s Gaussian minimum shift keying (GMSK) modulation over 25 or 12.5 kHz channels using the High-level Data Link Control (HDLC) packet protocol.
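The HDLC packet protocol mentioned above relies on bit stuffing so that payload data can never imitate the frame-delimiting flag sequence (01111110). A minimal sketch of that rule:

```python
def hdlc_stuff(bits):
    """HDLC bit stuffing: insert a 0 after every run of five consecutive
    1s, preventing payload bits from mimicking the 01111110 flag."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:
            out.append(0)
            run = 0
    return out

print(hdlc_stuff([1, 1, 1, 1, 1, 1]))   # a 0 is inserted after five 1s
```

The receiver applies the inverse rule, deleting any 0 that follows five 1s, so the payload is recovered exactly.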
Although only one radio channel is necessary, each station transmits and receives over two radio channels to avoid interference problems, and to allow channels to be shifted without communications loss from other ships. The system provides for automatic contention resolution between itself and other stations, and communications integrity is maintained even in overload situations.
AIS data is transmitted using GMSK modulation, filtered to fit within the confines of a 25 kHz channel mask (see image below). Data sent outside this mask is liable to be lost and not received by other AIS devices operating within the AIS system. Transmitting data within this mask ensures the integrity of the AIS system and that all data is received.
In figure 1, the data is sent within the mask (the lines that surround the radio wave); all of this data will be received by other AIS devices.
In figure 2, some of the data is sent outside the mask, which will not be received by other AIS devices in range.
For 'Class B' operation, the channel mask is specified for a slotted transmission burst, and so includes, and accounts for, the effects of transients at the beginning and end of the data burst.
There are 27 different types of top-level messages defined in ITU-R M.1371-4 (out of a possible 64) that can be sent by AIS transceivers.
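In an NMEA 0183 !AIVDM sentence the message payload is 6-bit ASCII armoured, and the first six bits carry the message type. A small sketch of that decoding (the example payload characters are illustrative, not complete messages):

```python
def sixbit(ch):
    """Decode one character of 6-bit ASCII armouring: subtract 48,
    and subtract a further 8 for the upper range of characters."""
    v = ord(ch) - 48
    if v > 40:
        v -= 8
    return v

def message_type(payload):
    """The first payload character holds the 6-bit message type (1-27)."""
    return sixbit(payload[0])

print(message_type("1"))   # type 1: Class A position report
print(message_type("B"))   # type 18: Class B position report
```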
AIS messages 6, 8, 25 and 26 provide "Application Specific Messages" (ASMs), which allow "competent authorities" to define additional AIS message sub-types. There are addressed (ABM) and broadcast (BBM) variants of these messages.
Addressed messages, while containing a destination MMSI, are not private and may be decoded by any receiver. One of the first uses of ASMs was the Saint Lawrence Seaway's use of AIS binary messages (message type 8) to provide information about water levels, lock orders and weather. The Panama Canal uses AIS type 8 messages to provide information about rain along the canal and wind in the locks. In 2010, the International Maritime Organization issued Circular 289, which defines the next iteration of ASMs for type 6 and 8 messages.
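Each ASM carries a 16-bit Application Identifier, a 10-bit Designated Area Code (DAC) plus a 6-bit Function Identifier (FI), which tells receivers which sub-type follows. A sketch of splitting that field, assuming the message has already been unpacked to a bit string and that, as in a type 8 broadcast binary message, the identifier starts at bit 40:

```python
def application_id(bits, offset=40):
    """Split the 16-bit Application Identifier into (DAC, FI).
    `bits` is the whole message as a '0'/'1' string; for a type 8
    broadcast binary message the identifier begins at bit 40."""
    field = int(bits[offset:offset + 16], 2)
    return field >> 6, field & 0x3F   # top 10 bits = DAC, low 6 = FI

# Hypothetical type 8 message fragment with DAC 200 and FI 10
msg = "0" * 40 + format((200 << 6) | 10, "016b") + "0" * 24
print(application_id(msg))
```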
Error-free communication is something every user would like to enjoy. Digital transmission, with its ability to completely avoid cumulative noise-induced degradation, should provide this. One reason the digital reality does not meet expectations is mis-timing inside transmission equipment when data is regenerated. When mis-timing becomes large, errors are produced and the system can become unusable. Even at low values of mis-timing, sensitivity to amplitude and phase variations is increased and performance suffers.
Jitter is always present within devices, systems and networks to a certain degree. In order to ensure interoperability between devices and minimize signal degradation due to jitter accumulation across long distances, it is important that there are limits set on the maximum level of jitter present at an output interface and the minimum level that can be tolerated at an input. Adherence to these limits will ensure interworking between different vendor equipment and networks, as well as providing the basis for demarcation.
Slow variations in signal timing through a system are called wander. Higher speed variations are termed jitter. The division between the two is taken at 10 Hz. Wander is measured using a single pole lowpass filter with its –3 dB point at 10 Hz while jitter uses a high-pass filter with the same –3 dB frequency.
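The 10 Hz split described above can be illustrated with first-order filters. This is a minimal sketch, not a calibrated measurement filter: a slow (DC-like) timing offset appears in the wander path and is rejected by the jitter path.

```python
import math

F_SPLIT = 10.0   # Hz: variations below this are wander, above are jitter

def lowpass(samples, fs, fc=F_SPLIT):
    """Single-pole low-pass with its -3 dB point at fc (wander path)."""
    alpha = 1.0 / (1.0 + fs / (2 * math.pi * fc))
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

def highpass(samples, fs, fc=F_SPLIT):
    """Complementary high-pass with the same -3 dB point (jitter path)."""
    return [x - y for x, y in zip(samples, lowpass(samples, fs, fc))]

fs = 1000.0            # sample rate of the timing-error record, Hz
tie = [1.0] * 5000     # a constant timing offset (pure wander)
print(round(lowpass(tie, fs)[-1], 3), round(highpass(tie, fs)[-1], 3))
```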
A fundamental operation in every digital transmission system is to receive a degraded signal and regenerate it. All high-capacity systems transmit only a suitably coded data signal, and the first task of a regenerator is to produce a local clock signal from that data. There are two contradictory requirements. First, the local clock should be stable for onward transmission and easier aggregation with other data sources. Second, the local clock should track incoming phase variations of the data signal so that, as the optimum sampling point for the input data varies, the clock tracks it. This leads to the danger of phase variations building up as a signal traverses a network and each regenerator in turn attempts to track incoming phase variations.
There are three measurements that define the jitter performance of a transmission system and specifications and standards can be expected to refer to all three:
• Output jitter - a measurement of the jitter present on an output from a system
• Jitter tolerance - a measurement to check the resilience of equipment to input jitter
• Jitter transfer - a measure of how much jitter is transferred between input and output of network equipment
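The first and third of these can be sketched numerically. These are illustrative helpers only, with time-interval-error samples expressed in unit intervals (UI):

```python
import math

def output_jitter_pp(tie):
    """Peak-to-peak output jitter from time-interval-error samples (UI)."""
    return max(tie) - min(tie)

def jitter_transfer_db(jitter_in_pp, jitter_out_pp):
    """Jitter transfer expressed as a gain in dB; a negative value means
    the equipment attenuates jitter between its input and output."""
    return 20.0 * math.log10(jitter_out_pp / jitter_in_pp)

print(output_jitter_pp([0.10, -0.20, 0.05]))    # 0.30 UI peak-to-peak
print(round(jitter_transfer_db(1.0, 0.5), 1))   # about -6.0 dB
```

Jitter tolerance, by contrast, is an active test: a known amount of jitter is applied to the input and the equipment is checked for error-free operation, so it cannot be reduced to a simple calculation like the two above.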
Source: Agilent Technologies