History and IMAX 3D Technology

How we see
The fact that our left eye and right eye see objects from different angles is the basis of 3D photography. If you try looking at an object through one eye and then the other, you will notice that it slightly changes position. With both eyes open, however, the two images that each eye observes separately are fused together as one by our brain. It is the fusion of these two images that creates normal binocular sight and allows our brain to understand depth and distance.

Pre-cinema
To replicate this process on film, two camera lenses are used in place of our two eyes. In 1838, Charles Wheatstone invented the world's first stereoscopic viewer, based on Renaissance theories of perspective. Constructed of an assortment of angled mirrors, his invention contained two separate drawings - one for the left eye and one for the right. When both images were observed at the same time, Wheatstone's viewing device produced a stereo image. Wheatstone's device ushered in a new era in motion and still photography.

3D Cinema
Filmmakers place the two lenses of a 3D camera at about the same distance apart as the distance between our eyes. This space is referred to as the interocular distance, or interaxial distance, and is typically set at about 2-1/2 inches. To project a 3D film, two individual images representing the perspectives of the left and right eye are simultaneously projected on screen. Without special glasses during the presentation, it will seem like you are seeing double - because in fact you are. Fortunately, the 3D glasses correct this problem. Each lens of the 3D glasses has a special filter which blocks out the opposing image, allowing each eye to see only one image. Your brain perceives the fusion of the two separate images as one three-dimensional image.
Projecting 3D Film
There are several ways to project the dual images necessary to exhibit a 3D film; however, not all processes require two separate projectors. The anaglyphic film format simultaneously projects two different, offset images from one single strip of film. One image is tinted green (or blue); the other image is tinted red. Spectators are given glasses with one green (or blue) lens and one red lens. The green lens of the glasses cancels out the red image on screen, while the red lens of the glasses cancels out the green (or blue) image on the screen. Your brain processes the two separated images as one 3D 'black and white' image! To see 3D in colour, the images for the left and right eye must be kept separate. Before the advent of today's large format theaters, which use two separate synchronized projectors, earlier methods placed two 35mm frames in various configurations, either over and under each other or side by side.

Modern Improvements
Contemporary 3D films have begun to use computer generated imagery (CGI) to maximize the 3D illusion. Use of computer created images allows filmmakers total control over convergence and focus, the two most problematic aspects of live action 3D production. By creating the environment in the computer, the point of convergence can be precisely set by the filmmaker. Furthermore, the entire frame can be kept in focus, something nearly impossible to do when shooting by conventional means. What this means is that when the film is projected onto the screen, you will absorb the visual information much like you would in the real world, thus maximizing the illusion.
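The anaglyph principle described above is easy to demonstrate with NumPy: build one frame whose red channel carries the left-eye view and whose green and blue channels carry the right-eye view. This is a minimal sketch; real anaglyph encoders also adjust colours to reduce ghosting.

```python
import numpy as np

def make_anaglyph(left, right):
    """Combine two RGB views into one red/cyan anaglyph frame: the
    red channel comes from the left-eye image, the green and blue
    channels from the right-eye image, so a red filter blocks the
    right view and a cyan filter blocks the left view."""
    anaglyph = np.empty_like(left)
    anaglyph[..., 0] = left[..., 0]    # red   <- left eye
    anaglyph[..., 1] = right[..., 1]   # green <- right eye
    anaglyph[..., 2] = right[..., 2]   # blue  <- right eye
    return anaglyph

# Two tiny synthetic "camera" views (height x width x RGB).
left = np.zeros((2, 2, 3), dtype=np.uint8)
left[..., 0] = 200                     # a reddish left view
right = np.zeros((2, 2, 3), dtype=np.uint8)
right[..., 2] = 150                    # a bluish right view

frame = make_anaglyph(left, right)
```

Viewed through the coloured lenses, each eye recovers only its own channel, and the brain fuses the pair into one image.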
Comments:
a fusion - integration, merging, blending
stereoscopic - giving an impression of depth; three-dimensional
an assortment - a collection, a variety
interocular - between the eyes; the name of the distance between the two lens axes
interaxial - between the axes; the name of the distance between the two lens axes
anaglyphic - relating to the anaglyph method, which is based on the ability of colour filters to pass some rays of light and block others. As a rule, two filters are used to build the image - red and aqua (a mix of green and blue) - but other glasses (red-green, red-blue) can also be used.
an offset - a counterbalance, a contrast
a convergence - a coming together, a meeting point
HDTV

HDTV stands for High Definition TV, and while the FCC does not have a standard definition for HDTV, it is widely agreed that HDTV is defined as having higher quality video, higher quality audio and a wider image aspect ratio than standard television broadcast signals. HDTV is part of a larger set of standards called ATSC (Advanced Television Systems Committee), a group which defines the standards for digital television transmission in the United States and many other countries. The FCC (Federal Communications Commission) mandated that all licensed television stations be capable of broadcasting DTV by 2007. To understand how much higher the quality of HDTV is, you need to know the quality of standard TV. In the US, a standard TV picture has 525 scanned lines per image. Only half the image (every other line) is refreshed each 60th of a second, so a full image is refreshed every 30th of a second. This format of refreshing an image is called interlaced. Progressive is when an image is refreshed in its entirety in each update. Of the 525 lines that are scanned, only 480 are visible on the TV. Standard TV is therefore also known as 480i (480 lines of usable resolution, interlaced).

HDTV Resolution
There are four formats that can be viewed on a High Definition TV: 480i, 480p, 720p and 1080i. The higher the number, the higher the resolution. Although HDTVs can display all four formats, most experts refer to High Definition TV broadcasts as having either 720p or 1080i resolution. High quality image resolution is the main selling point of HDTVs. All HDTV signals are digital signals; your TV no longer relies on analog signals for broadcasts. Most HDTVs are able to process either HDTV format (720p or 1080i). HDTV signals require 19.39 Mbps of bandwidth - five times the bandwidth of standard TV signals.
This is true even though HDTV uses MPEG-2 compression (MPEG-2, from the Motion Picture Experts Group, is a compression standard for digital television) to conserve as much bandwidth as possible. The 720p format offers 720 horizontal lines of resolution with progressive scan. Progressive scan means that every line is refreshed in each frame update. The 1080i format offers 1080 horizontal lines of resolution with interlacing. Interlacing means that every other line is refreshed in each update, so it takes two updates to repaint the entire screen. 1080p offers the best of both worlds: 1080 lines of progressively scanned video.

HDTV Screen Ratio
Standard TVs use a 4:3 aspect ratio. Aspect ratio is the ratio of a picture's width to its height. This means that the standard screen format is more like a square than a cinema screen. Widescreen cinema formats are close to an aspect ratio of 16:9, which is the screen ratio of all HDTV screens. This means that you can watch most movies on your TV much as they would be shown in the theater. Most TVs crop off the sides of movies and programming that do not fit the ratio, so you may be missing as much as a third of the picture. However, with HDTV, you get to see the entire image without any cropping or letterboxing.

HDTV Digital Sound
Another great feature is that HDTV is able to receive and reproduce 5.1 independent channels of digital sound. This format is generally termed Dolby AC3 and reproduces CD quality digital sound. 5.1 means that you can hook up 5 separate speakers plus one subwoofer. AC3 is the audio format used by ATSC. The 5 speakers are usually part of a home theater surround sound system and consist of 2 front channel speakers, one center channel speaker and 2 rear (sometimes referred to as surround) speakers. It is important to note that HDTV requires either a built-in HDTV receiver or a stand-alone receiver to watch HDTV programming.
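The difference between interlaced and progressive refresh described above can be sketched in a few lines of Python (a toy model of the scan pattern, not broadcast code):

```python
def repaint_updates(lines, interlaced):
    """Count how many screen updates it takes to refresh every line
    of the picture once. Interlaced scanning refreshes every other
    line per update (first one field, then the other); progressive
    scanning refreshes all lines in a single update."""
    refreshed = set()
    updates = 0
    while len(refreshed) < lines:
        updates += 1
        if interlaced:
            start = 0 if updates % 2 else 1   # alternate the two fields
            refreshed.update(range(start, lines, 2))
        else:
            refreshed.update(range(lines))
    return updates

full_picture_interlaced = repaint_updates(480, interlaced=True)    # 2 updates
full_picture_progressive = repaint_updates(480, interlaced=False)  # 1 update
```

This is exactly why 1080i needs two updates to repaint the screen while 720p and 1080p need only one.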
Just having a High Definition television set will not allow you to receive HDTV broadcasts and view them on your HDTV set; you must also have an HDTV receiver.

Watching High Definition TV
There are three ways to watch High Definition TV. The first way is to receive free broadcasts via the airwaves. You just need an HDTV, an HDTV receiver and an antenna. You can pick up HDTV signals from local broadcasters. These channels usually include all the major networks, such as NBC, CBS, ABC, FOX and PBS. The second way to watch High Definition TV is to have a cable or satellite signal piped into your home using an addressable set-top box. Most cable and satellite TV boxes include the ability to view HDTV signals. The third way to watch High Definition TV is with a DVD or DVR player. Many DVD players can output progressive format video (480p), and newer DVD and DVR players can output either 720p or 1080p.

HD Television Sets
Most of today's HDTV sets come in either LCD or Plasma. These sets are usually about 30% more expensive than traditional TV sets and can be thousands of dollars more for extremely large sets, usually over 60 inches. All HD television sets use the 16:9 ratio, have outputs for 5.1 AC3 digital sound and are extremely thin: most LCD and Plasma TVs are only a few inches thick, ranging from about 2 inches to less than 6 inches in depth.
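The 19.39 Mbps figure quoted earlier gives a sense of how much work MPEG-2 compression does. A rough back-of-the-envelope calculation in Python (the pixel counts and bit depth are illustrative assumptions, not part of the broadcast specification):

```python
# Uncompressed bit rate of a 1080-line HD picture at 30 frames/s,
# assuming 4:2:0 chroma sampling (12 bits per pixel on average).
width, height, fps, bits_per_pixel = 1920, 1080, 30, 12
raw_mbps = width * height * fps * bits_per_pixel / 1e6   # ~746.5 Mbit/s

channel_mbps = 19.39                          # ATSC payload quoted in the text
compression_ratio = raw_mbps / channel_mbps   # roughly 38:1
```

Under these assumptions, the compressor must squeeze the raw picture by nearly forty to one to fit the channel.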
Comments:
Dolby - sound processing systems created by Dolby Laboratories, Inc. ("Dolby Labs"), headed by Ray Dolby, a pioneer of the audio and video industry.
to hook up - to connect
NBC - National Broadcasting Company - one of the largest American television and radio broadcasting companies. Founded in 1926.
CBS - CBS Broadcasting Inc. - an American television and radio network. The name comes from Columbia Broadcasting System, the company's former legal name.
FOX - Fox Broadcasting Company - an American television company owned by Fox Entertainment Group; one of the largest television companies in the world.
PBS - Public Broadcasting Service - the public broadcasting service of the USA.
DVR - digital video recorder - a device that records video in digital format to a disk drive or another storage medium inside the device.
MPEG

MPEG (Moving Picture Experts Group) is a working group of the International Organization for Standardization (ISO) which sets standards for compressing and storing video, audio, and animation in digital form. The Moving Picture Experts Group's first meeting was held in Ottawa, Canada, in May 1988. Over the years, MPEG has grown to include around 350 members per meeting from several industries, research institutions, and universities. The official designation of the Moving Picture Experts Group is ISO/IEC JTC1/SC29 WG11. Pronounced "m-peg", the term MPEG also stands for the family of digital video compression techniques and digital file formats created by the group. Generally, MPEG can create higher quality video files than competing formats such as Video for Windows, QuickTime, and Indeo. MPEG files can be decoded by software programs or by special hardware. MPEG files attain high compression rates by storing only the changes which occur between two frames, rather than storing each frame in its entirety. The technique used by MPEG to encode video information is known as DCT (the discrete cosine transform). Much like JPEG (the Joint Photographic Experts Group's lossy compression technique for colour pictures), MPEG uses lossy compression, in which certain data is removed from the files. However, end users cannot normally notice the reduction in quality, as the removed detail is hardly noticeable to the human eye.

MPEG Standards
Though there are several MPEG standards, MPEG-1, MPEG-2, and MPEG-4 are the three most popular:
MPEG-1: MPEG-1, the first video and audio compression standard, supports a video resolution of 352x240 at 30 fps (frames per second). However, the video quality of MPEG-1 is slightly lower than that of a normal VCR. MPEG-1 also has the ability to include audio compressed in the MP3 format.
MPEG-2: MPEG-2 can support video resolutions of 720x480 and 1280x720 at 60 frames per second, with audio quality equal to conventional CD audio. MPEG-2 is suitable for almost all television standards, including ATSC, NTSC and HDTV. MPEG-2 can reduce a two-hour video file to a few gigabytes of data. Encoding video to MPEG-2 requires fairly significant processing power, but decoding MPEG-2 data back to video is not as processor intensive. The MPEG-2 standards are also used to store data on DVDs.
MPEG-4: Introduced in late 1998, MPEG-4 is based on MPEG-1, MPEG-2, and Apple QuickTime technology. This graphics and video compression standard can create wavelet-based files which are smaller than QuickTime or JPEG files. MPEG-4 files are designed to transmit images and video while using less network bandwidth. MPEG-4 files can combine video with graphics, text, and 2-D and 3-D animation layers. Additional features of MPEG-4 include object-oriented composite files (such as video, audio, and VRML objects), VRML support for 3D rendering, and support for externally specified DRM (Digital Rights Management).

Other MPEG standards
MPEG-3: MPEG-3 was originally developed for HDTV, but as the MPEG-2 standard was found to be sufficient for HDTV, MPEG-3 was abandoned.
MPEG-7: MPEG-7 is a formal standard for describing multimedia content.
MPEG-21: MPEG-21 is designed to share machine-readable license information in a "ubiquitous, unambiguous and secure" manner.
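Two of the ideas above - storing only the changes between frames, and the resulting "two hours in a few gigabytes" - can be sketched in Python with NumPy (a toy model of inter-frame coding; the 5 Mbit/s average bitrate is an assumed, DVD-typical value):

```python
import numpy as np

def encode_delta(prev, curr):
    """A toy version of MPEG's inter-frame idea: record only the
    positions and new values of pixels that changed since the
    previous frame, instead of the whole frame."""
    changed = np.nonzero(prev != curr)
    return changed, curr[changed]

def decode_delta(prev, delta):
    """Rebuild the current frame from the previous one plus the changes."""
    changed, values = delta
    frame = prev.copy()
    frame[changed] = values
    return frame

prev = np.zeros((4, 4), dtype=np.uint8)   # a tiny 4x4 greyscale "frame"
curr = prev.copy()
curr[1, 2] = 255                          # only one pixel changed

delta = encode_delta(prev, curr)          # stores 1 pixel, not all 16
rebuilt = decode_delta(prev, delta)

# The "two hours in a few gigabytes" figure, at the assumed
# average bitrate of 5 Mbit/s:
size_gb = 5_000_000 * 2 * 3600 / 8 / 1e9  # 4.5 GB
```

When most of the picture is static, almost nothing needs to be transmitted, which is where the large compression ratios come from.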
Comments:
MPEG (Moving Picture Experts Group) - an expert group on moving pictures; a group of specialists under ISO that meets to develop standards for compressing digital video and audio.
QuickTime - a proprietary technology developed by Apple Computer in 1989 for playing back digital video, sound, text, animation, music and panoramic images in various formats.
Indeo Video - a video codec developed by Intel in 1992. A video codec is a program/algorithm for compressing (i.e. reducing the size of) video data (a video file or video stream).
DCT - Discrete Cosine Transform - one of the orthogonal transforms; a variant of the cosine transform for a vector of real numbers.
VCR - videocassette recorder - a device for recording or reading a video signal on magnetic tape.
a wavelet - a short wave train
VRML - Virtual Reality Modeling Language - a standard file format for displaying three-dimensional interactive vector graphics, most often used on the WWW.
DRM - Digital Rights Management - technical means of copyright protection: most often software (less often combined hardware and software) tools which make it difficult to copy protected works distributed in electronic form, or which make it possible to track the creation of such copies.

The Future of Television

Much of the new technology has come from Nippon Television Network, Japan's largest commercial TV company. Its system is known as progressive-scan digital television. Negotiations to put this into use through one of Japan's three new digital-satellite-television services (PerfecTV!, DirecTV and JSkyB) are expected to be concluded within a month. What distinguishes progressive-scan broadcasting from conventional television is that it transmits the full 525 lines of the screen image (i.e., one full frame) every sixtieth of a second.
Current technology sends only half the lines in a frame (first the odd-numbered ones, then the even-numbered ones) every sixtieth of a second. The picture perceived by the eye is an optical illusion created by the "interlacing" of the two alternating sets of lines. Transmitting signals this way helps reduce the amount of bandwidth needed for broadcasting. The price viewers pay is a blurrier image and a slightly flickering screen.

Such compromises, however, are no longer necessary. Unlike its analogue counterpart, the signal used in digital television can easily be compressed by a computer chip to remove redundant information and thus make it more compact. The picture is then decompressed by a second chip in the receiver. This allows what is actually broadcast to remain well within the bandwidth available even when it is transmitted a full frame at a time. So a 525-line progressive-scan system offers twice the resolution of a conventional 525-line interlaced system. That means that the progressive-scan system is delivering the same amount of information to the screen as a 1,050-line interlaced television - not far short of the resolution of the 1,125-line interlaced HiVision picture.

Engineers at Nippon Television reckon that the decoder needed to decompress the digital signal at the receiving end should add no more than 10% to the price of an existing wide-screen television set. Indeed, receivers capable of showing progressive-scan pictures as well as ordinary interlaced ones have already been announced by Sony, Matsushita and JVC. Prices start at around ¥330,000 ($2,850) for 80cm (32-inch) models.

But pin-sharp pictures for modest outlays are only the beginning. The video techniques can also be used to broadcast images in three dimensions. Nippon Television is working on this idea, too. Its system, developed jointly with Sanyo Electric, is also based on progressive scanning.
The main difference is that two closely spaced cameras are used in the studio to feed stereoscopic information to the encoders that compress the image for transmission. To squeeze the two images into a single frame for transmission, both must first be squashed to half their normal height. (Their widths remain unaltered.) The easiest way to do that is to use interlacing. In this sense, the arrangement works more like conventional broadcasting. The difference is that, thanks to compression, twice as much information is transmitted in any given frame - the left-eye perspective and a slightly different right-eye version.

At the receiving end, the composite image is decompressed and split into its right-eye and left-eye components. These are then displayed alternately on the TV set as if they were ordinary interlaced images. The result is a picture with the same quality as a 525-line television but displaying two sets of slightly different images on the screen. To see them in 3D, a viewer must wear special spectacles. The lenses of these spectacles contain shutters made of liquid crystals (chemicals that can be rendered temporarily opaque by the application of an electric current). When an image is flashed on the screen, an infrared beam from the TV set momentarily closes the shutter in front of the eye that is not supposed to see it.

Strictly, even the spectacles are unnecessary. By building special lenses into the surface of a television screen, the two images can be beamed separately to the appropriate eyes. The unanswered question is whether the public is ready for 3D television. Clearly, TV producers will have to learn how to use it effectively; too much of it could quickly become tiresome.
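The packing scheme described above - squashing each eye's view to half height and interleaving the two as alternating lines of one frame - can be sketched with NumPy (a toy model; the 2:1 squash here just drops every other line, whereas a real encoder would filter):

```python
import numpy as np

def pack_stereo(left, right):
    """Squash each eye's view to half height (widths unaltered) and
    interleave them as the alternating lines of a single frame."""
    frame = np.empty_like(left)
    frame[0::2] = left[::2]    # one field carries the left eye
    frame[1::2] = right[::2]   # the other field carries the right eye
    return frame

def unpack_stereo(frame):
    """Split the composite frame back into its two half-height views."""
    return frame[0::2], frame[1::2]

left = np.full((8, 4), 1, dtype=np.uint8)    # left-eye test pattern
right = np.full((8, 4), 2, dtype=np.uint8)   # right-eye test pattern
packed = pack_stereo(left, right)
l_half, r_half = unpack_stereo(packed)
```

The receiver's job is just the `unpack_stereo` step: recover the two views and flash them alternately, in sync with the shuttered spectacles.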
Comments:
Nippon Television Network - a Japanese television network headquartered in Shiodome, Tokyo; also known as Nihon TV or NTV. Its programming includes news, films, sport, entertainment shows, anime cartoons, etc.
blurring - the softening of edges (a way of imitating the visual perception of moving objects)
flickering - shimmering, unsteady (of light)
redundant - superfluous, excessive; kept in reserve
to reckon - to count, to calculate; to conclude
modest outlay - modest expense
to squash - to squeeze, to compress
to split - to divide, to separate
a shutter - a device that blocks light; a gate, a flap
opaque - not transparent; impenetrable, dark
Computers

The Invention of the Computer
There is not just one inventor of the computer, as the ideas of many scientists and engineers led to its invention. These ideas were developed in the 1930s and 1940s, mostly independently of each other, in Germany, Great Britain and the USA, and were turned into working machines.

In Germany, Konrad Zuse hit upon the idea of building a program-controlled calculating machine when he had to deal with extensive calculations in statics. In 1935, he began to design a program-controlled calculating machine in his parents' home in Berlin. It was based on the binary system and used punched tape for program input. The Z1, which was built between 1936 and 1938, was a purely mechanical machine which was not fully operational. In 1940, Zuse began to build a successor to the Z1 based on relay technology. In May 1941, he finished the Z3 - the world's first freely programmable, program-controlled automatic calculator that was operational.

Several similar developments were in progress in the USA at the same time. In 1939, IBM started to build a program-controlled relay calculator on the basis of a concept that Howard H. Aiken had put forward in 1937. This machine - the IBM Automatic Sequence Controlled Calculator (Mark I) - was used on production work from 1944. However, it was not Aiken's and Stibitz's relay calculators that were decisive for the development of the universal computer but the ENIAC, which was developed at the Moore School of Electrical Engineering at the University of Pennsylvania. Extensive ballistic computations were carried out there for the U.S. Army during World War II with the aid of a copy of the analog Differential Analyzer, which had been designed by Vannevar Bush, and more than a hundred women working on mechanical desk calculators. Nevertheless, capacity was barely sufficient to compute the artillery firing tables that were needed. In August 1942, John W.
Mauchly, a physicist, presented a memo at the Moore School proposing a vacuum tube computer conceived as a digital version of the Differential Analyzer. Mauchly had adopted John Vincent Atanasoff's idea for an electronic computer: Atanasoff had developed the ABC special-purpose computer at Iowa State College (now Iowa State University) to solve systems of linear equations, and Mauchly had viewed the ABC in June 1940. John Presper Eckert, a young electronic engineer at the Moore School, was responsible for the brilliant engineering of the new ENIAC. The work began on 31 May 1943 with funding from the U.S. Army. In February 1946, successful program runs were demonstrated. At almost the same time, the Model I to Model VI relay calculators were built at Bell Laboratories in New York following a suggestion by George R. Stibitz.

John von Neumann, an influential mathematician, turned his attention to the ENIAC in the summer of 1944. While this computer was being built, von Neumann and the ENIAC team drew up a plan for a successor to the ENIAC. The biggest problem with the ENIAC was that its memory was too small. Eckert suggested a mercury delay-line memory which would increase memory capacity by a factor of 100 compared with the electronic memory used in the ENIAC. An equally big problem was programming the ENIAC, which could take hours or even days. In meetings with von Neumann, the idea of a stored-program, universal machine evolved: memory was to be used to store the program in addition to data. This would enable the machine to execute conditional branches and change the flow of the program. The concept of a computer in the modern sense of the word was born. In spring 1945, von Neumann wrote his "First Draft of a Report on the EDVAC", which described the stored-program, universal computer. The logical structure presented in this draft report is now referred to as the von Neumann architecture.
This EDVAC report was originally intended for internal use only, but it became the "bible" for computer pioneers throughout the world in the 1940s and 1950s. The first two computers featuring the von Neumann architecture were built not in America but in Great Britain. On 21 June 1948, Frederic C. Williams of the University of Manchester managed to run the prototype of the Manchester Mark I, and thus proved it was possible to build a stored-program, universal computer. The first fully functional von Neumann computer was built by Maurice Wilkes at Cambridge University. This machine, called EDSAC, first ran a program on 6 May 1949, computing a table of square numbers.
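The stored-program idea at the heart of the von Neumann architecture - keeping instructions in memory alongside data so the machine can take conditional branches and change its own flow of control - can be illustrated with a toy instruction set in Python (the opcodes are invented for illustration, not taken from any real machine):

```python
def run(program, data):
    """A toy stored-program machine in the spirit of the EDVAC
    report: the program lives in memory alongside the data, and a
    conditional branch instruction can redirect the flow of control."""
    pc = 0          # program counter
    acc = 0         # accumulator
    while True:
        op, arg = program[pc]
        if op == "LOAD":
            acc = data[arg]
        elif op == "ADD":
            acc += data[arg]
        elif op == "STORE":
            data[arg] = acc
        elif op == "JUMP_IF_NEG":
            if acc < 0:
                pc = arg        # the branch: jump to a new instruction
                continue
        elif op == "HALT":
            return data
        pc += 1

# Add data[0] and data[1]; if the sum is negative, store data[3]
# (zero) instead -- a conditional branch in action.
program = [
    ("LOAD", 0), ("ADD", 1), ("JUMP_IF_NEG", 5),
    ("STORE", 2), ("HALT", None),
    ("LOAD", 3), ("STORE", 2), ("HALT", None),   # branch target
]
result = run(program, [2, 3, 0, 0])   # result[2] == 5
```

On the ENIAC, changing this behaviour meant rewiring; on a stored-program machine, it just means writing a different `program` list into memory.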
Comments:
a punched tape - perforated tape
IBM (International Business Machines) - a multinational corporation headquartered in Armonk, New York (USA); one of the world's largest manufacturers of computing equipment, peripherals and software, and a major provider of consulting services.
ENIAC (Electronic Numerical Integrator And Computer) - the first large-scale, electronic, digital computer that could be reprogrammed to solve a full range of problems.
ABC (Atanasoff-Berry Computer) - the special-purpose electronic computer developed by John Vincent Atanasoff at Iowa State College to solve systems of linear equations.
EDVAC (Electronic Discrete Variable Automatic Computer) - one of the first electronic computers. Unlike ENIAC, it was a binary rather than a decimal machine.
a delay-line memory - a memory (storage device) based on delay lines
EDSAC (Electronic Delay Storage Automatic Computer) - the world's first operational and practically used stored-program computer. Its architecture was inherited from the American EDVAC.

Personal Computers

Personal computers, or microcomputers, were made possible by two technical innovations in the field of microelectronics: the integrated circuit, or IC, which was developed in 1959, and the microprocessor, which first appeared in 1971. The IC permitted the miniaturization of computer-memory circuits, and the microprocessor reduced the size of a computer's CPU to the size of a single silicon chip. The microprocessor - a device which combines the equivalent of thousands of transistors on a single, tiny silicon chip - was developed by Ted Hoff at Intel Corporation in the Santa Clara Valley south of San Francisco, California, an area that was destined to become known to the world as Silicon Valley because of the microprocessor and computer industry that grew up there.
Because a CPU calculates, performs logical operations, contains operating instructions, and manages data flows, the potential existed for developing a separate system that could function as a complete microcomputer. The first such desktop-size system specifically designed for personal use appeared in 1974; it was offered by Micro Instrumentation Telemetry Systems (MITS). The company's owners were then encouraged by the editor of a popular technology magazine to create and sell a mail-order computer kit through the magazine. The computer, which was called the Altair, retailed for slightly less than $400. The demand for the microcomputer kit was immediate, unexpected, and totally overwhelming. Scores of small entrepreneurial companies responded to this demand by producing computers for the new market.

The first major electronics firm to manufacture and sell personal computers, Tandy Corporation (Radio Shack), introduced its model in 1977. It quickly dominated the field because of the combination of two attractive features: a keyboard and a cathode-ray tube (CRT) display terminal. It was also popular because it could be programmed and the user was able to store information by means of cassette tape. Soon after Tandy's new model was introduced, two engineer-programmers - Stephen Wozniak and Steven Jobs - started a new computer manufacturing company named Apple Computers.

In 1976, in what is now Silicon Valley, Steve Jobs and Steve Wozniak created a homemade microprocessor computer board called the Apple I. Working from Jobs' parents' garage, the two men began to manufacture and market the Apple I to local hobbyists and electronics enthusiasts. Early in 1977, Jobs and Wozniak founded Apple Computer, Inc., and in April of that year introduced the Apple II, one of the world's first ready-made personal computers.
Based on a board of their own design, the Apple II, complete with keyboard and color graphics capability, retailed for $1,290. Some of the new features they introduced into their microcomputers were expanded memory, inexpensive disk-drive programs and data storage, and color graphics. Apple Computers went on to become the fastest-growing company in U.S. business history. Its rapid growth inspired a large number of similar microcomputer manufacturers to enter the field. Before the end of the decade, the market for personal computers had become clearly defined.

In 1981, IBM introduced its own microcomputer model, the IBM PC. Although it did not make use of the most recent computer technology, the PC was a milestone in this burgeoning field. It proved that the microcomputer industry was more than a passing fad, and that the microcomputer was in fact a necessary tool for the business community. The PC's use of a 16-bit microprocessor initiated the development of faster and more powerful micros, and its use of an operating system that was available to all other computer makers led to a de facto standardization of the industry.

In the mid-1980s, a number of other developments were especially important for the growth of microcomputers. One of these was the introduction of powerful 32-bit computers capable of running advanced multi-user operating systems at high speeds. This blurred the distinction between microcomputers and minicomputers, placing enough computing power on an office desktop to serve all small businesses and most medium-size businesses. Another innovation was the introduction of simpler, "user-friendly" methods for controlling the operations of microcomputers. By substituting a graphical user interface (GUI) for the conventional operating system, computers such as the Apple Macintosh allow the user to select icons - graphic symbols of computer functions - from a display screen instead of requiring typed commands.
Douglas Engelbart invented an "X-Y Position Indicator for a Display System": the prototype of the computer "mouse", whose convenience has revolutionized personal computing. New voice-controlled systems are now available, and users may eventually be able to use the words and syntax of spoken language to operate their microcomputers.
Comments:
IC - integrated circuit
CPU - central processing unit
MITS - Micro Instrumentation Telemetry Systems
a kit - a set of equipment
to inspire - to encourage, to influence
a fad - a whim, a passing fashion
de facto - in fact, in practice
Master of Invention

Nolan Bushnell (Born in 1943)
The father of home video games. He built Pong in 1972, starting the video-game craze that led to today's powerful super systems. During the 1950s and 1960s, computers improved enormously. Still, only big businesses, universities and the military had them. Then in 1972, the video-game craze began. Computers were scaled down to small boxes, using electronic circuitry instead of the Mark I's switches. They could do more than analyse data - they could play games. The first big hit was a simple game called Pong. Two players sat in front of a television screen where a "ball" - a point of light - bounced back and forth. Using knobs on a cabinet, the players could hit the ball with inch-long "paddles" on the screen. Pong was created by Nolan Bushnell, who grew up near Salt Lake City, Utah. He loved to tinker with machines and became an electrical engineer. He had played primitive computer games that were even older than Pong. "I built it with my own two hands and a soldering iron," Bushnell said of his creation of the first Pong game. In 1972 Bushnell founded Atari Inc. in Sunnyvale, California, to build Pong games. By 1975 there were 150,000 Pong games in American homes.

Steve Wozniak (Born in 1950) and Steven Jobs (Born in 1955)
Working out of a garage, the young video game fanatics invented the Apple computer in 1976. The age of home computers was born. One of Atari's early employees was 19-year-old Steve Jobs; his friend Steve Wozniak worked for another computer company. Both loved video games, and Jobs and Wozniak dreamed of a personal computer, one that could do more than play games. From this dream, the Apple Computer Company started in a family garage. In 1977 Jobs and Wozniak sold their first Apple II, which launched the personal computer industry. By 1985 they had sold more than two million Apple IIs. The Apple II was more than a toy. People could use it to write letters, to keep financial records and to teach their children.
And, yes, they could play games on it. The Apple II evolved into today's high-tech Macintosh computers. These computers popularised the use of the mouse, the hand-controlled device that moves the cursor on a computer display.
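The ball-and-paddle mechanics described above can be sketched in a few lines. This is only an illustrative model in Python; the original Pong was built from hardware logic rather than software, and the field dimensions and names here are assumptions.

```python
def step(x, y, dx, dy, top=0, bottom=20, left=0, right=40):
    """Advance the ball one tick on an assumed 40x20 playfield.

    The ball moves by its velocity (dx, dy); hitting the top or bottom
    wall reverses the vertical direction, and a paddle at the left or
    right edge returns the ball by reversing the horizontal direction.
    """
    x, y = x + dx, y + dy
    if y <= top or y >= bottom:   # bounce off a horizontal wall
        dy = -dy
    if x <= left or x >= right:   # a paddle returns the ball
        dx = -dx
    return x, y, dx, dy
```

Calling `step` once per frame and drawing the ball at `(x, y)` is the whole game loop; the real skill element is only in moving the paddles in time.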
Comments: Pong - a video game whose gameplay is based on ping-pong. a paddle - a small bat or blade. to tinker - to fiddle with and repair machines. soldering iron - a tool for melting solder to join metal parts. to evolve - to develop gradually. Macintosh (or Mac) - a line of personal computers designed, developed, manufactured and sold by Apple Inc. They run the Mac OS operating system (currently Mac OS X) and take their name from the McIntosh variety of apple.

Microsoft Windows

Microsoft Windows (or simply Windows) is a software programme that makes your IBM PC (or compatible) easy to use. It does this by simplifying the computer's user interface. The word interface refers to the way you give your computer commands, the way you interact with it. Usually the interface between you and the computer consists of the screen and the keyboard: you interact with the computer by responding to what is on the screen, typing in commands at the DOS command line to do your work. DOS often isn't very intelligent at interpreting your commands, and most people consider it awkward or intimidating as a user interface. Its commands can be confusing and difficult to remember. Who wants to learn lots of computer commands just to see what's on a disk, copy a file, or format a disk?

Windows changes much of this. What has been missing from the PC is a programme that makes the computer easy to use. Windows is just such a programme. With Windows, you can run programmes, enter and move data around, and perform DOS-related tasks simply by using the mouse to point at objects on the screen. Of course, you also use the keyboard to type in letters and numbers. Windows interprets your actions and tells DOS and your computer what to do. In addition to making DOS housekeeping tasks - creating directories, copying files, deleting files, formatting disks, and so forth - easier, Windows makes running your favorite applications easier, too.
(An application is a software package that you use for a specific task, such as word processing). Windows owes its name to the fact that it runs each programme or document in its own separate window. (A window is a box or frame on the screen.) You can have numerous windows on the screen at a time, each containing its own programme and/or document. You can then easily switch between programs without having to close one down and open the next. Another feature is that Windows has a facility - called the Clipboard - that lets you copy material between dissimilar document types, making it easy to cut and paste information from, say, a spreadsheet into a company report or put a scanned photograph of a house into a real estate brochure. In essence, Windows provides the means for seamlessly joining the capabilities of very different application programs. Not only can you paste portions of one document into another, but by utilizing more advanced document-linking features those pasted elements remain "live". That is, if the source document (such as some spreadsheet data) changes, the results will also be reflected in the secondary document containing the pasted data. As more and more application programmes are written to run with Windows, it'll be easier for anyone to learn how to use new programmes. This is because all application programmes that run in Windows use similar commands and procedures. Windows comes supplied with a few of its own handy programmes. There's a word-processing programme called Write, a drawing programme called Paintbrush, a communications programme called Terminal for connecting to outside information services over phone lines, small utility programmes that are helpful for keeping track of appointments and notes, a couple of games to help you escape from your work, and a few others. Years of research went into developing the prototype of today's popular graphical user interfaces. 
It was shown in the early 1980s that the graphical user interface, in conjunction with a hand-held pointing device (now called the mouse), was much easier to operate and understand than the older-style keyboard-command approach to controlling a computer. A little-known fact is that this research was conducted by the Xerox Corporation and first resulted in the Xerox Star computer before IBM PCs or Macintoshes existed. It wasn't until later that the technology was adapted by Apple Computer for its Macintosh prototype, the Lisa.
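The "live" document-linking behaviour described above - a pasted element that updates whenever its source changes - can be modelled with a simple notification scheme. The Python sketch below is only an illustration of the idea; the class names are invented, and real Windows linking used the OLE/DDE mechanisms rather than anything like this code.

```python
class Spreadsheet:
    """A source document whose total can be pasted as a live link."""

    def __init__(self, values):
        self.values = values
        self._links = []              # documents holding a live link to us

    def paste_link_into(self, doc):
        """Paste our total into doc and remember it for future updates."""
        self._links.append(doc)
        doc.refresh(sum(self.values))

    def update(self, values):
        """Change the source data and push the new total to every link."""
        self.values = values
        for doc in self._links:
            doc.refresh(sum(self.values))


class Report:
    """A secondary document containing a pasted, linked figure."""

    def __init__(self):
        self.total = None

    def refresh(self, total):
        self.total = total            # the pasted figure is replaced in place
```

For example, pasting a spreadsheet total into a report and then editing the spreadsheet leaves the report showing the new total, with no re-pasting by the user.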
Comments: DOS (Disk Operating System) - a family of operating systems for personal computers, oriented towards the use of disk storage devices such as hard disks and diskettes. to switch - to change over from one thing to another. to paste - to insert. Clipboard - a temporary data store provided by the operating system and accessible to applications through a defined interface.
Windows XP

Windows XP is a line of operating systems developed by Microsoft for use on general-purpose computer systems, including home and business desktops, notebook computers, and media centres. The letters "XP" stand for eXPerience. It was codenamed "Whistler" after Whistler, British Columbia, as many Microsoft employees skied at the Whistler-Blackcomb ski resort during its development. Windows XP is the successor to both Windows 2000 and Windows Me, and is the first consumer-oriented operating system produced by Microsoft to be built on the Windows NT kernel and architecture. It was first released on October 25, 2001, and over 400 million copies are in use, according to a January 2006 estimate by an IDC analyst. It is succeeded by Windows Vista, which was released to volume license customers on November 8, 2006 and worldwide to the general public on January 30, 2007.

The most common editions of the operating system are Windows XP Home Edition, which is targeted at home users, and Windows XP Professional, which has additional features such as support for Windows Server domains and dual processors, and is targeted at power users and business clients. Windows XP Media Center Edition has additional multimedia features enhancing the ability to record and watch TV shows, view DVD movies, and listen to music. Windows XP Tablet PC Edition is designed to run the ink-aware Tablet PC platform. Two separate 64-bit versions of Windows XP were also released: Windows XP 64-bit Edition for IA-64 (Itanium) processors and Windows XP Professional x64 Edition for x86-64 processors.

Windows XP is known for its improved stability and efficiency over previous versions of Microsoft Windows. It presents a significantly redesigned graphical user interface, a change Microsoft promoted as more user-friendly than previous versions of Windows. New software management capabilities were introduced to avoid the "DLL hell" that plagued older consumer versions of Windows.
It is also the first version of Windows to use product activation to combat software piracy, a restriction that did not sit well with some users and privacy advocates. Windows XP has also been criticized by some users for security vulnerabilities, tight integration of applications such as Internet Explorer and Windows Media Player, and for aspects of its user interface. Windows XP had been in development since early 1999, when Microsoft started working on Windows Neptune, an operating system intended to be the "Home Edition" equivalent to Windows 2000 Professional. It was eventually cancelled and became Whistler, which later became Windows XP. Many ideas from Neptune and Odyssey (another cancelled Windows version) were used in Windows XP.
Comments: a desktop - a desktop computer. Whistler-Blackcomb - a ski resort in the Canadian province of British Columbia. Windows NT kernel - the core of the NT family of operating systems; a reusable, preemptive operating-system kernel designed to work with both uniprocessor and symmetric multiprocessor computers. IDC (International Data Corporation) - an analyst firm specializing in research on the information technology market. Windows Server domain - a group of computers on one network with a single centre (called the domain controller) that uses a common user database, common group and local policies, common security settings, account time restrictions and other parameters, greatly simplifying the work of an organization's system administrator when it operates a large number of computers. Itanium - a microprocessor with the IA-64 architecture, developed jointly by Intel and Hewlett-Packard. DLL hell - a problem situation connected with the management of dynamic link libraries (DLLs) in the Microsoft Windows operating system.
Windows Vista Microsoft began work on Windows Vista, known at the time by its codename Longhorn, in May 2001, five months before the release of Windows XP. It was originally expected to ship sometime late in 2003 as a minor step between Windows XP and Blackcomb, which was planned to be the company's next major operating system release. Gradually, "Longhorn" assimilated many of the important new features and technologies slated for Blackcomb, resulting in the release date being pushed back several times. Many of Microsoft's developers were also re-tasked to build updates to Windows XP and Windows Server 2003 to strengthen security. Faced with ongoing delays and concerns about feature creep, Microsoft announced on August 27, 2004 that it had revised its plans. The original Longhorn, based on the Windows XP source code, was scrapped, and Longhorn's development started anew, building on the Windows Server 2003 Service Pack 1 codebase, and re-incorporating only the features that would be intended for an actual operating system release. Some previously announced features such as WinFS were dropped or postponed, and a new software development methodology called the Security Development Lifecycle was incorporated in an effort to address concerns with the security of the Windows codebase. After Longhorn was named Windows Vista in July 2005, an unprecedented beta-test program was started, involving hundreds of thousands of volunteers and companies. In September of that year, Microsoft started releasing regular Community Technology Previews (CTP) to beta testers. The first of these was distributed at the 2005 Microsoft Professional Developers Conference, and was subsequently released to beta testers and Microsoft Developer Network subscribers. The builds that followed incorporated most of the planned features for the final product, as well as a number of changes to the user interface, based largely on feedback from beta testers. 
Windows Vista was deemed feature-complete with the release of the "February CTP" on February 22, 2006, and much of the remaining work between that build and the final release of the product focused on stability, performance, application and driver compatibility, and documentation. Beta 2, released in late May, was the first build to be made available to the general public through Microsoft's Customer Preview Program. It was downloaded by over five million people. Two release candidates followed in September and October, both of which were made available to a large number of users. While Microsoft had originally hoped to have the consumer versions of the operating system available worldwide in time for Christmas 2006, it was announced in March 2006 that the release date would be pushed back to January 2007 in order to give the company, and the hardware and software companies on which Microsoft depends for device drivers, additional time to prepare. Development of Windows Vista came to an end when Microsoft announced that it had been finalized on November 8, 2006. Windows Vista cost Microsoft 6 billion dollars to develop.

Windows Vista contains many changes and new features, including an updated graphical user interface and visual style dubbed Windows Aero, redesigned search functionality, multimedia tools including Windows DVD Maker, and redesigned networking, audio, print, and display sub-systems. Vista aims to increase the level of communication between machines on a home network, using peer-to-peer technology to simplify sharing files and digital media between computers and devices. Windows Vista includes version 3.0 of the .NET Framework, allowing software developers to write applications without traditional Windows APIs. Microsoft's primary stated objective with Windows Vista has been to improve the state of security in the Windows operating system.
One common criticism of Windows XP and its predecessors is their commonly exploited security vulnerabilities and overall susceptibility to malware, viruses and buffer overflows. In light of this, Microsoft chairman Bill Gates announced in early 2002 a company-wide "Trustworthy Computing" initiative, which aims to incorporate security work into every aspect of software development at the company. Microsoft stated that it prioritized improving the security of Windows XP and Windows Server 2003 above finishing Windows Vista, thus delaying its completion. While these new features and security improvements have garnered positive reviews, Vista has also been the target of much criticism and negative press. Criticism of Windows Vista has targeted its high system requirements, its more restrictive licensing terms, the inclusion of a number of new digital rights management technologies aimed at restricting the copying of protected digital media, lack of compatibility with some pre-Vista hardware and software, and the number of authorization prompts for User Account Control. As a result of these and other issues, Windows Vista saw initial adoption and satisfaction rates lower than those of Windows XP. However, with an estimated 350 million users as of January 2009, Vista usage had surpassed Microsoft's pre-launch expectation of reaching 200 million users within two years of release. As of the end of May 2009, Windows Vista was the second most widely used operating system on the internet with a 24.35% market share, the most widely used being Windows XP with a 61.54% market share.
Comments: WinFS (Windows Future Storage) - a data and metadata management platform from the Microsoft corporation. Security Development Lifecycle - a software development methodology adopted by Microsoft to build security work into every stage of software development. a beta test - public testing: a stage of active testing and debugging of software that has already passed alpha testing (if there was one). Programs at this stage may be used by other software developers to check compatibility; nevertheless, they may still contain quite a large number of bugs.