HD-MAC was a broadcast television standard proposed by the European Commission in 1986 as part of the Eureka 95 project, and an early attempt by the EEC to provide high-definition television (HDTV) in Europe. It was a complex mix of an analogue video signal (Multiplexed Analogue Components) multiplexed with digital sound and with assistance data for decoding (DATV). The video signal (1250 lines, of which 1152 visible, at 50 fields per second in a 16:9 aspect ratio) was encoded with a modified D2-MAC encoder.
HD-MAC could be decoded by standard D2-MAC receivers (SDTV), but in that mode only 625 lines (576 visible) were shown, along with certain artifacts. Decoding the signal at full resolution required a dedicated HD-MAC tuner.
The European Broadcasting Union describes video formats as follows: width × height [scan type: i or p] / number of full frames per second.
As an example, the 1280×720p/60 format provides sixty 1280×720-pixel progressively scanned pictures each second. Lines are transmitted in the natural sequence: 1, 2, 3, 4, and so on.
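To make the notation concrete, here is a minimal Python sketch (my own illustration, not an EBU tool; the helper name parse_ebu_format is hypothetical) that unpacks such a descriptor:

```python
import re

def parse_ebu_format(desc: str) -> dict:
    """Parse an EBU-style descriptor such as '1280x720p/60' or '720x576i/25'."""
    m = re.fullmatch(r"(\d+)x(\d+)([ip])/(\d+)", desc)
    if not m:
        raise ValueError(f"not an EBU format descriptor: {desc!r}")
    width, height, scan, fps = int(m[1]), int(m[2]), m[3], int(m[4])
    return {"width": width, "height": height,
            "scan": "interlaced" if scan == "i" else "progressive",
            "frames_per_second": fps,
            # interlaced formats carry two fields per frame
            "fields_per_second": 2 * fps if scan == "i" else fps}

print(parse_ebu_format("1280x720p/60"))
print(parse_ebu_format("720x576i/25"))  # -> 50 fields per second
```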
European standard-definition broadcasts use 720×576i/25, meaning 25 interlaced frames per second, each 720 pixels wide and 576 pixels high: the odd lines (1, 3, 5, ...) are grouped into the odd field, which is transmitted first, followed by the even field containing lines 2, 4, 6, and so on. There are thus two fields per frame, giving a field frequency of 25 × 2 = 50 Hz.
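The field structure can be illustrated with a short sketch (assuming a frame held as a NumPy array; line 1 of the text corresponds to array row 0):

```python
import numpy as np

# A hypothetical 576-line SD frame (luma only, for illustration).
frame = np.zeros((576, 720), dtype=np.uint8)

odd_field = frame[0::2, :]   # lines 1, 3, 5, ... -> transmitted first
even_field = frame[1::2, :]  # lines 2, 4, 6, ... -> transmitted second

# Two 288-line fields per 40 ms frame period: 25 x 2 = 50 fields per second.
assert odd_field.shape == (288, 720) and even_field.shape == (288, 720)
```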
The visible part of the video signal provided by an HD-MAC receiver was 1152i/25, exactly doubling the vertical resolution of standard definition. The amount of information is multiplied by four, given that the encoder started its operations from a 1440×1152i/25 sampling grid.
Work on the HD-MAC specification started officially in May 1986. The purpose was to react against a Japanese proposal, supported by the US, which aimed to establish the NHK-designed system as a world standard. Besides preserving the European electronics industry, there was also a need for a standard compatible with the 50 Hz field-frequency systems used by a large majority of countries in the world. In truth, the exactly 60 Hz field rate of the Japanese proposal also worried the US, since their NTSC-M-based standard-definition infrastructure used a practical frequency of 59.94 Hz; this apparently minor difference had the potential to cause a lot of trouble.
In September 1988, the Japanese performed the first high-definition broadcasts of the Olympic Games, using their Hi-Vision system (NHK had been producing material in this format since 1982). In that same month, Europe showed a credible alternative for the first time, namely a complete HD-MAC broadcasting chain, at IBC 88 in Brighton. The show included the first progressive-scan HD video camera prototypes (Thomson/LER).
For the Albertville 1992 Winter Olympics and the Barcelona 1992 Summer Olympics, a public demonstration of HD-MAC broadcasting took place. Sixty HD-MAC receivers for the Albertville games and 700 for the Barcelona games were set up in 'Eurosites' to show the capabilities of the standard. CRT video projectors of 1250 lines (1152 visible) were used to create images a few meters wide, alongside some Thomson 'Space System' 16:9 CRT TV sets and, at times, rear-projection televisions. In addition, some 80,000 owners of D2-MAC receivers were also able to watch the channel (though not in HD). It is estimated that 350,000 people across Europe saw this demonstration of European HDTV, which was financed by the EEC. The PAL-converted signal was used by mainstream broadcasters such as SWR, BR and 3sat. The HD-MAC standard was also demonstrated at Seville Expo '92, exclusively using equipment designed to work with the standard, such as Plumbicon and CCD cameras, direct-view and rear-projection CRT TVs, BCH-1000 Type B VTRs, single-mode fiber optic cables, and Laserdisc players with their respective discs. Production equipment was visible to the public through windows.
Because spare UHF bandwidth was very scarce, HD-MAC was de facto usable only by cable and satellite providers, whose bandwidth was less constricted; similarly, Hi-Vision was only broadcast by NHK through a dedicated satellite channel called BShi. However, the standard never became popular among broadcasters. As a result, analogue HDTV could not replace conventional terrestrial SDTV (PAL/SECAM), making HD-MAC sets unattractive to potential consumers.
From 1986, all high-powered satellite broadcasters were required to use MAC. However, the launch of medium-powered satellites by SES and the use of PAL allowed broadcasters to bypass the MAC requirement, reducing their transmission costs. HD-MAC (the high-definition variant of MAC) was nevertheless retained for transcontinental satellite links.
The HD-MAC standard was abandoned in 1993, and since then all EU and EBU efforts have focused on the DVB system (Digital Video Broadcasting), which allows both SDTV and HDTV.
This article about IFA 1993 provides a view of the project's status close to its end. It mentions 'a special BBC compilation encoded in HD-MAC and replayed from a D1 Video Tape Recorder'. Most probably this was in fact a BRR, a device that used more modern digital compression to fit HD onto a single D-1 tape and bore no relationship to HD-MAC.
HD-MAC development was stopped, alongside the EUREKA project, in 1996: picture quality was not deemed good enough, receiving TVs did not have enough resolution, the 16:9 aspect ratio that would later become standard was seen as exotic, and most receiving TVs were not large enough, while those that were, being CRTs, were extremely heavy.
PAL/SECAM analogue SDTV broadcasts use 6 MHz, 7 MHz (VHF) or 8 MHz (UHF) channels; the 819-line System E used 14 MHz-wide VHF channels. For HD-MAC, the transmission medium had to guarantee a baseband bandwidth of at least 11.14 MHz, which translates to a 12 MHz channel spacing in cable networks. The specification allows for 8 MHz channels, but in that case the assistance data can no longer be correctly decoded, and it is only possible to extract a standard-definition signal using a D2-MAC receiver. For satellite broadcasting, due to FM modulation spectrum expansion, an entire satellite transponder would be used, i.e. 27 to 36 MHz of bandwidth. The situation is much the same for analogue standard definition: a given transponder can only support one analogue channel, so from this point of view going to HD represented no extra inconvenience.
BRE (Bandwidth Reduction Encoding) operation started with analogue HD video (even when the source was a digital recorder, it was reconverted to analogue to feed the encoder). The input was specified to have a 50 Hz field frequency. It could be interlaced, with 25 frames per second (called 1250/50/2 in the recommendation), or progressively scanned, with 50 full frames per second (called 1250/50/1); the interlaced version was the one used in practice. In either case, the number of visible lines was 1152, twice the standard 576-line vertical definition. The full number of lines in a frame period, including those that cannot be displayed, was 1250, which makes for a 32 µs line period. According to the ITU recommendation for HDTV standard parameters, the active part of the line was 26.67 µs long (see also the LDK 9000 camera document).
Had the modern trend for square pixels applied, this would have yielded a 2048×1152 sampling grid. There was no such requirement in the standard, though, since CRT monitors need no extra scaling to show non-square pixels. According to the specification, the sampling rate for the interlaced input was 72 MHz, resulting in 72 × 26.67 = 1920 horizontal samples, which were then reconverted to 1440 within the sampled domain. The input signal often originated from sources previously sampled at only 54 MHz, for economic reasons, and therefore already contained no more than the analogue equivalent of 1440 samples per line. In any case, the starting point for BRE was a 1440×1152 sampling grid (twice the horizontal and vertical resolutions of digital SD), interlaced, at 25 fps.
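The figures above can be checked with a few lines of arithmetic (values taken from the text; the variable names are mine):

```python
sampling_rate_hz = 72e6    # specified sampling rate for the interlaced input
active_line_s = 26.67e-6   # active part of the line per the ITU recommendation

# 1250 lines x 25 frames/s -> line period of 1/(1250 * 25) = 32 us
print(1 / (1250 * 25) * 1e6)                    # 32.0

# samples on the active part of a line: 72 MHz x 26.67 us ~ 1920
print(round(sampling_rate_hz * active_line_s))  # 1920
# the 1920-sample line was then resampled to 1440 (a factor of 3/4)
```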
Improving the horizontal resolution of the D2-MAC norm only required increasing its bandwidth; this was easily done because, unlike PAL, the sound is not sent on a sub-carrier but multiplexed with the picture. Increasing vertical resolution was more complex, however, as the line frequency had to stay at 15.625 kHz to remain compatible with D2-MAC. This left three choices of coding mode, differing in how long the encoder took to build a full picture: a 20 ms mode, a 40 ms mode and an 80 ms mode, detailed below.
As none of the three modes would have been sufficient on its own, the choice during encoding was made not for the whole picture but for small blocks of 16×16 pixels. The signal then contained hints (the DATV digital stream) that controlled which de-interlacing method the decoder should use for each block.
The 20 ms mode offered improved temporal resolution, but the 80 ms mode was the only one that provided high spatial definition in the usual sense. The 40 ms mode threw away one of the HD fields and reconstructed it in the receiver with the assistance of motion-compensation data. Some indications were also provided in case of whole-frame movement (camera panning, etc.) to improve the quality of the reconstruction.
The encoder could work in a 'camera' operating mode, using all three coding modes, but also in a 'film' mode, in which the 20 ms coding mode was not used.
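The text does not spell out the encoder's exact decision criteria, but the principle of a per-block mode choice signalled through DATV can be sketched as follows (the thresholds, the function names and the motion measure are all hypothetical):

```python
import numpy as np

BLOCK = 16  # the standard's block size for mode decisions

def motion_energy(prev_block: np.ndarray, cur_block: np.ndarray) -> float:
    """Mean absolute difference between co-sited blocks of consecutive frames."""
    return float(np.mean(np.abs(cur_block.astype(int) - prev_block.astype(int))))

def choose_modes(prev_frame: np.ndarray, cur_frame: np.ndarray,
                 low: float = 2.0, high: float = 10.0) -> np.ndarray:
    """One mode code per 16x16 block: 0 = 80 ms (static areas, full spatial
    detail), 1 = 40 ms (moderate motion, motion-compensated), 2 = 20 ms
    (fast motion, full temporal resolution). Thresholds are arbitrary."""
    h, w = cur_frame.shape
    modes = np.empty((h // BLOCK, w // BLOCK), dtype=np.uint8)
    for by in range(h // BLOCK):
        for bx in range(w // BLOCK):
            ys, xs = by * BLOCK, bx * BLOCK
            e = motion_energy(prev_frame[ys:ys + BLOCK, xs:xs + BLOCK],
                              cur_frame[ys:ys + BLOCK, xs:xs + BLOCK])
            modes[by, bx] = 0 if e < low else (1 if e < high else 2)
    return modes  # this map would travel to the receiver as DATV assistance data
```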
The 80 ms mode took advantage of its reduced 12.5 fps frame rate to spread the contents of an HD frame over two SD frames, meaning four 20 ms fields = 80 ms, hence the name.
But that was not enough, as a single HD frame contains the equivalent of four SD frames. This could have been 'solved' by doubling the bandwidth of the D2-MAC signal, thereby doubling the allowed horizontal resolution. Instead, the standard D2-MAC channel bandwidth was preserved, and one pixel out of two was dropped from each line. This sub-sampling was done in a quincunx pattern: assuming the pixels on each line are numbered from 1 to 1440, only pixels 1, 3, 5, ... were retained from the first line, pixels 2, 4, 6, ... from the second, pixels 1, 3, 5, ... again from the third, and so on. That way, information from all the columns of the HD frame was conveyed to the receiver. Each missing pixel was surrounded by four transmitted ones (except at the edges) and could be interpolated from them. The resulting 720-sample horizontal resolution was further truncated to the 697-samples-per-line limit of the D2-HDMAC video multiplex.
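The quincunx pattern itself is easy to express in code (a sketch on a NumPy grid; the 1-based pixel numbers in the text map to 0-based array indices):

```python
import numpy as np

def quincunx_subsample(frame: np.ndarray) -> np.ndarray:
    """Keep pixels 1, 3, 5, ... on odd lines and 2, 4, 6, ... on even lines
    (1-based numbering as in the text), halving each line."""
    h, w = frame.shape
    out = np.empty((h, w // 2), dtype=frame.dtype)
    out[0::2, :] = frame[0::2, 0::2]  # lines 1, 3, 5, ...: odd-numbered pixels
    out[1::2, :] = frame[1::2, 1::2]  # lines 2, 4, 6, ...: even-numbered pixels
    return out

hd = np.arange(1152 * 1440, dtype=np.uint16).reshape(1152, 1440)
assert quincunx_subsample(hd).shape == (1152, 720)  # 720 samples per line
```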
As a consequence of these operations, a 4:1 reduction factor was achieved, allowing the high-definition video signal to be transported in a standard D2-MAC channel. The samples retained by the BRE were assembled into a valid standard-definition D2-MAC vision signal and finally converted to analogue for transmission. The modulation parameters were chosen such that the independence of the samples was preserved.
To fully decode the picture, the receiver had to sample the signal again, store it, and read it back from memory several times. The BRD (Bandwidth Restoration Decoder) in the receiver would then reconstruct a 1394×1152 sampling grid from it, under the control of the DATV stream, to be fed into its DAC.
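The neighbour interpolation mentioned above can be sketched as follows (illustrative only; the dropped quincunx positions are marked with NaN, so the grid must be a float array):

```python
import numpy as np

def interpolate_missing(sparse: np.ndarray) -> np.ndarray:
    """Estimate each dropped pixel from its four transmitted neighbours
    (left, right, above, below); sparse holds NaN at dropped positions."""
    out = sparse.copy()
    h, w = sparse.shape
    for y in range(h):
        for x in range(w):
            if np.isnan(sparse[y, x]):
                neighbours = [sparse[ny, nx]
                              for ny, nx in ((y, x - 1), (y, x + 1),
                                             (y - 1, x), (y + 1, x))
                              if 0 <= ny < h and 0 <= nx < w]
                # in a quincunx grid, all in-bounds neighbours were transmitted
                out[y, x] = float(np.mean(neighbours))
    return out
```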
The final output was a 1250 (1152 visible) lines, 25 fps, interlaced, analogue HD video signal, with a 50 Hz field frequency.
European systems are generally referred to as 50 Hz standards (their field frequency); the two fields are 20 ms apart in time. The Eu95 project stated that it would evolve towards 1152p/50, and that format is taken into account as a possible source in the D2-HDMAC specification. In it, a full frame is captured every 20 ms, preserving the quality of motion of television while adding solid, artifact-free frames that each represent a single instant in time, as in cinema. The 24 fps frame frequency of cinema is a bit low, though, and a generous amount of motion smear is required for the eye to perceive smooth motion; 50 Hz is more than twice that rate, so motion smear can be reduced in proportion, allowing for sharper pictures.
In practice, 50P was not used very much. Some tests were even done by having film shot at 50 fps and subsequently telecined.
Thomson/LER presented a progressive camera; however, it used a form of quincunx sampling and therefore had some bandwidth constraints.
This requirement meant pushing the technology boundaries of the time, and would have added to the notorious lack of sensitivity of some Eu95 cameras (particularly CRT-based ones). This thirst for light was one of the problems that plagued the operators shooting the French film 'L'affaire Seznec' ('The Seznec case') in 1250i. Some CCD cameras were developed in the context of the project; see for example the LDK 9000: 50 dB signal-to-noise ratio at 30 MHz, 1000 lux at f/4.
The Eu95 system would have provided better compatibility with cinema technology than its competitor, first because of progressive scanning, and second because of the convenience and quality of transfer between 50 Hz standards and film (no motion artifacts; one just needs to invert the usual 'PAL speed-up' process by slowing the frame rate down in a 25/24 ratio). Taking one frame out of two from a 50p stream would have provided a suitable 25p video as a starting point for this operation. A sequence shot at 50p with a fully open shutter produces the same amount of motion smear as a 25p shot with a half-open shutter, a common setting when shooting with a standard movie camera.
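The frame-rate arithmetic involved is simple enough to write down (figures from the text):

```python
capture_fps = 50              # 50p acquisition
video_fps = capture_fps / 2   # 25p after discarding every other frame
film_fps = 24

slowdown = video_fps / film_fps   # 25/24, the inverse of 'PAL speed-up'
print(f"running time stretched by {(slowdown - 1) * 100:.1f} %")  # ~4.2 %
```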
In practice, Hi-Vision seems to have been more successful in that regard, having been used for films such as Giulia e Giulia (1987) and Prospero's Books (1991).
A consumer tape recorder prototype was presented in 1988. It had an 80-minute recording time and used a 1.25 cm 'metal' tape. Bandwidth was 10.125 MHz and the signal-to-noise ratio 42 dB.
An HD-MAC videodisc prototype was designed as well. The version presented in 1988 could record 20 minutes per side of a 30 cm disc; bandwidth was 12 MHz and the signal-to-noise ratio 32 dB. This medium was used for several hours at Expo 92.
On the studio and production side, the situation was entirely different. HD-MAC bandwidth-reduction techniques bring the HD pixel rate down to the level of SD, so in theory it would have been possible to use an SD digital video recorder, provided it offered enough room for the DATV assistance stream, which requires less than 1.1 Mbit/s. SD video in 4:2:0 format (12 bits per pixel) needs 720 × 576 × 25 × 12 bits per second, which is slightly less than 125 Mbit/s, to be compared with the 270 Mbit/s available from a D-1 machine.
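Those figures check out, as shown by this short calculation (values from the text):

```python
width, height, fps = 720, 576, 25
bits_per_pixel = 12     # 4:2:0 format, as stated above
video_mbps = width * height * fps * bits_per_pixel / 1e6
datv_mbps = 1.1         # upper bound for the DATV assistance stream

print(video_mbps)                     # 124.416 -> slightly less than 125 Mbit/s
print(video_mbps + datv_mbps <= 270)  # True: fits a 270 Mbit/s D-1 machine
```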
But there was no real reason for studio equipment to be constrained by HD-MAC, as the latter was only a transmission standard, used to convey the HD material from the transmitter to the viewers. Furthermore, studios have the technical and financial resources to store HD video at better quality for editing and archiving.
So in practice, other methods were used. At the start of the Eureka 95 project, the only means of recording the HD signal from a camera was a massive 1-inch reel-to-reel tape machine, the BTS BCH 1000, based on the Type B videotape format but with 8 video heads instead of the usual two, to accommodate the higher bandwidth requirements of HD-MAC.
The plan within the Eureka 95 project was to develop an uncompressed digital recorder with 72 MHz sampling, dubbed the 'Gigabit' recorder. It was expected to take a year to develop, so in the interim two alternative digital recording systems were assembled, both using the standard-definition D1 uncompressed digital component recorder as a starting point.
The quincunx-subsampled, or double/dual D1, system developed by Thomson used two D-1 digital recorders synchronized in a master/slave relationship: odd fields could then be recorded on one of the D-1s and even fields on the other. Horizontally, the system recorded just half the bandwidth, with samples taken on a quincunx sampling grid. This gave full-bandwidth performance in the diagonal direction, but halved it horizontally or vertically, depending on the exact temporal-spatial characteristics of the image.
The Quadriga system was developed by the BBC in 1988 using four synchronised D1 recorders and 54 MHz sampling, distributing the signal in such a way that blocks of 4 pixels were sent to each recorder in turn. Thus, if a single tape was viewed, the image appeared as a fair but distorted representation of the whole picture, enabling edit decisions to be taken from a single recording. A three-machine edit was possible on a single Quadriga by processing each of the four channels in turn, with identical edits subsequently made on the other three channels under the control of a programmed edit controller.
The original D1 recorders were restricted to a parallel video interface with very bulky, short cables, but this was not a problem, since the digital signals were contained within the five half-height racks (four D1s plus the interface/control/interleaving rack) that made up the Quadriga, and initially all external signals were analogue components. The introduction of SDI (the 270 Mbit/s Serial Digital Interface) had simplified cabling by the time the BBC constructed a second Quadriga.
Philips also constructed a Quadriga, but used a slightly different format, with the HD image divided into four quadrants, each quadrant going to one of the four recorders. Apart from a slightly longer processing delay, it worked much like the BBC approach, and both versions of the Quadriga equipment were made interoperable, switchable between interleaved and quadrant modes.
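Both distribution schemes can be sketched in a few lines (NumPy stands in for the analogue sample stream; the shapes are nominal and the function names are mine):

```python
import numpy as np

def bbc_interleave(frame: np.ndarray) -> list:
    """BBC scheme: successive blocks of 4 pixels go to each recorder in turn,
    so a single tape still holds a coarse version of the whole picture."""
    h, w = frame.shape
    quads = frame.reshape(h, w // 4, 4)  # group the pixels of each line in fours
    return [quads[:, k::4, :].reshape(h, -1) for k in range(4)]

def philips_quadrants(frame: np.ndarray) -> list:
    """Philips scheme: one spatial quadrant of the image per recorder."""
    top, bottom = np.vsplit(frame, 2)
    return [q for half in (top, bottom) for q in np.hsplit(half, 2)]

frame = np.zeros((1152, 1440), dtype=np.uint8)
assert all(t.shape == (1152, 360) for t in bbc_interleave(frame))
assert all(q.shape == (576, 720) for q in philips_quadrants(frame))
```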
In about 1993, Philips, in a joint venture with Bosch (BTS), produced a 'BRR' (Bit Rate Reduction) recording system that enabled the full HD signal to be recorded onto a single D1 (or D5) recorder. If the tape was replayed on a conventional D1 recorder, a low-resolution version of the image could be viewed in the centre of the screen, surrounded by what appeared to be noise but was in fact coded/compressed data, in a way similar to later MPEG digital compression techniques; the compression rate was 5:1, starting from 72 MHz sampling. Some BRR equipment also contained Quadriga interfaces for ease of conversion between recording formats, switchable between the BBC and Philips versions of the Quadriga format. By this time, Quadriga signals were being carried on four SDI cables.
Finally, with help from Toshiba, the Gigabit recorder, by now known as the D6 'Voodoo', was produced in around 2000, some years after work on the 1250-line system had ceased in favour of the Common Image Format, the basis of HDTV as it is known today.
Hence the quality of Eureka 95 archives is higher than what viewers could see at the output of an HD-MAC decoder.
For the making of the HD-based movie L'affaire Seznec, the Thomson company certified that it would be able to transfer HD to 35 mm film, but none of the attempts were successful (shooting was done on dual-D1). However, another French movie shot in 1994, Du fond du coeur: Germaine et Benjamin, allegedly achieved such a transfer. It is said to have been shot in digital high definition, in 1250 lines. If so, it would arguably be the first digital high-definition movie, using a film-friendly 50 Hz field rate, 7 years before Vidocq and 8 years before Star Wars: Episode II – Attack of the Clones. For a historical perspective on HD-originated movies, one can mention early attempts such as 'Harlow', shot in 1965 using a near-HD analogue 819-line process that later evolved to higher resolutions (see Electronovision).
Experience was gained on important building blocks such as HD digital recording, digital processing including motion compensation, and HD CCD cameras, as well as on the factors driving acceptance or rejection of a new format by professionals. All of this was put to good use in the subsequent Digital Video Broadcasting project which, in contrast to HD-MAC, is a great worldwide success. Despite early claims by competitors that DVB could not do HD, it was soon deployed in Australia for just that purpose.
The cameras and tape recorders were reused for early experiments in digital high definition cinema.
The US brought home some of the Eu95 cameras to be studied in the context of their own HDTV standard development effort.
In France, a company called VTHR (Video Transmission Haute Resolution) used the Eu95 hardware for some time to retransmit cultural events to small villages (later, it switched to upscaled 15 Mbit/s MPEG-2 SD).
In 1993, Texas Instruments built a 2048×1152 DMD prototype. The papers give no rationale for choosing this specific resolution over the Japanese 1035-active-line system, or over doubling the 480 lines of standard US TV to 960; but in this way the chip could cover all resolutions expected to be present on the market, including the European one, which happened to be the highest. Some legacy of this development may be seen in '2K' and '4K' digital movie projectors using TI DLP chips, which run a slightly wider-than-usual 2048×1080 or 4096×2160 resolution. This gives a 1.896:1 aspect ratio without anamorphic stretching (versus the 1.778:1 of regular 16:9, with 1920 or 3840 horizontal pixels), a little (6.7%) more horizontal resolution with anamorphic lenses when showing 2.21:1 (or wider) movies specifically prepared for them, and a further enhancement (~13.78%) through reduced letterboxing when used without such lenses.
As of 2010, some computer monitors with 2048×1152 resolution were available (e.g. the 23-inch Samsung 2343BWX and the Dell SP2309W). This is unlikely to be a reference to Eu95, especially as the refresh rate generally defaults to 60 Hz (or 59.94 Hz); rather, it is simply a convenient 'HD+' resolution made for bragging rights over ubiquitous 1920×1080 HD panels, offering the slimmest possible resolution improvement while keeping the same 16:9 shape for video playback without cropping or letterboxing (the next 'convenient' step up being the comparatively much larger, and thus much more expensive, 2560×1600 '2.5K' used in e.g. Apple Cinema and Retina displays). It is also a 'neat' power-of-two width, twice that of the one-time standard XGA (so, for example, websites designed for that width can be smoothly zoomed to 200%), and happens to be four times the size of the 1024×576 panels commonly used in cheaper netbooks and mobile tablets (much as the 2.5K standard is four times the 1280×800 WXGA used in ultraportable laptops and midrange tablets). In this way it can be considered a form of convergent specification evolution: although there is little chance the two standards are directly related, their particulars will have been arrived at by broadly similar methods.
Although the fact is now mainly of historical interest, most larger-tube CRT PC monitors had a maximum horizontal scan rate of 70 kHz or higher, which means they could have handled 2048×1152 at 60 Hz progressive if set to use a custom resolution (with slimmer vertical blanking margins than HD-MAC/Eu95 itself for those rated below 75 kHz). Smaller models incapable of 70 kHz but good for at least 58 kHz (preferably 62.5 kHz), provided they also accepted the lower vertical refresh rate, could instead be set to run 50 Hz progressive, or even 100 Hz interlaced to avert the flicker that 50 Hz would otherwise cause.
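A rough check of those scan-rate claims (the vertical blanking overhead of roughly 5% is an assumption on my part):

```python
lines = 1152
blanking = 0.05  # assumed vertical blanking overhead for a custom CRT mode

for refresh in (60, 50):
    h_khz = lines * (1 + blanking) * refresh / 1000
    print(f"{lines} lines at {refresh} Hz progressive -> about {h_khz:.0f} kHz")
# ~73 kHz at 60 Hz (hence the 70 kHz-class requirement, with slim blanking)
# ~60 kHz at 50 Hz (within reach of a 62.5 kHz monitor)
```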