CA1217272A - Real time perspective display employing digital map generator - Google Patents

Real time perspective display employing digital map generator

Info

Publication number
CA1217272A
CA1217272A
Authority
CA
Canada
Prior art keywords
terrain
points
observer
line
lines
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired
Application number
CA000459583A
Other languages
French (fr)
Inventor
Paul B. Beckwith, Jr.
Donald S. Bistarkey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harris Corp
Original Assignee
Harris Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harris Corp filed Critical Harris Corp
Application granted granted Critical
Publication of CA1217272A publication Critical patent/CA1217272A/en
Expired legal-status Critical Current

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • G09B9/02Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/08Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B9/30Simulation of view from aircraft
    • G09B9/301Simulation of view from aircraft by computer-processed or -generated image
    • G09B9/302Simulation of view from aircraft by computer-processed or -generated image the image being transformed by computer processing, e.g. updating the image to correspond to the changing point of view
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation

Abstract

A system for generating a real time perspective view of the terrain lying along an aircraft's flight path accesses terrain data stored in a digital map generator and converts the data into a perspective representation of the terrain on the face of a suitable display such as a cockpit instrument panel CRT. The stored map data that is accessed provides, in real time, information pertaining to elevation features of the terrain over which the aircraft is flying, so that upon conversion to a perspective presentation to the pilot, there is provided a real time perspective image of the contours of the terrain as though the pilot were looking out a windscreen at the terrain in high visibility conditions. The invention also is capable of providing perspective scene rotation and translation (corresponding to roll and pitch of the aircraft).

Description


FIELD OF THE INVENTION
The present invention relates in general to information display systems and, more particularly, to a digital system for processing digital map data representative of the terrain over which a vehicle such as an aircraft is passing, or a simulation thereof, and the generation therefrom of a perspective display of the terrain to the pilot.
BACKGROUND OF THE INVENTION
In the navigation of aircraft over a prescribed route, contour maps have been conventionally employed to provide an indication of the terrain over which the aircraft is flying.
This information, together with instrument readings and visual observation of the terrain from the cockpit, enables the pilot to determine the altitude and course of the aircraft as it travels along its flight path. For low altitude flying, such as might be experienced in a helicopter, an instantaneous picture of the details of the terrain over which the aircraft is flying is of paramount importance in guiding the aircraft, especially where the contour of the terrain changes rapidly and contains obstacles to the flight of the aircraft.
In copending Canadian application Serial No. 393,822, filed January 8, 1981, by Paul B. Beckwith, Jr., entitled "Digital Map Generator and Display System", assigned to the assignee of the present application, there is described a system for effecting the dynamic display of terrain data which is stored in digital format and which may be viewed on a cockpit cathode ray tube display in the form of a moving map of the terrain over which the aircraft is flying, offering the pilot an advanced navigational tool not previously provided by conventional terrain mapping schemes.
The system described in that application operates so as to automatically orient the moving map, under the control of the aircraft's navigational computer system, to the instantaneous position of the aircraft with a heading-up disposition. Within this system there is employed a scene memory which stores terrain data and which is selectively accessed to generate the map display. The stored terrain data encompasses a number of definitive aspects of the terrain including, inter alia, both elevation and cultural information, so that the system is capable of providing a maximum of information to assist the pilot in navigating and controlling the aircraft.
Now, although the generation of a plan view map of the terrain over which the aircraft is flying greatly assists the pilot, he still relies on as many information sources as possible, including instrument readings and his own view from the cockpit. Unfortunately, in restricted visibility conditions, such as nighttime flying, poor weather, smoke, etc., the pilot's view of what lies ahead of the aircraft may provide very little, if any, information. Ideally, the pilot would prefer to always see the terrain ahead of the aircraft as it appears in daylight, high visibility conditions.
SUMMARY OF THE INVENTION

In accordance with the present invention, the need of the pilot to be provided with an image of the terrain showing the contours that he would normally expect to see from the cockpit in clear daylight flying conditions is satisfied by a system that is capable of generating a real time perspective view of the terrain lying along the aircraft flight path, regardless of conditions of visibility outside the aircraft. To this end, the system according to the present invention is comprised of a simplified hardware scheme that accesses terrain data stored in a digital map generator, such as that stored in the scene memory employed in the system detailed in the above-identified copending application, and converts this data into a perspective representation of the terrain on the face of a suitable display such as a cockpit instrument panel CRT. The stored map data that is accessed provides, in real time, information pertaining to elevation features of the terrain over which the aircraft is flying, so that upon conversion to a perspective presentation to the pilot, there is provided a real time perspective image of the contours of the terrain as though the pilot were looking out a windscreen at the terrain in high visibility conditions.
In accordance with a basic embodiment of the present invention, full resolution of a selected field of view (45° or 90°) using only a line buffer display storage scheme is accomplished by translating a perspective ray or line of observation of the terrain map in the plan view into a corresponding vertical line of the flat screen of the CRT on which the perspective presentation of the terrain is generated. From an observer's view point (the pilot as he views the terrain beneath and ahead of the aircraft on the display screen) a plurality of observation lines fan out on the terrain map plan view to define the field of view to be imaged. Projecting these lines onto the vertical display screen (e.g. CRT face plate) generates a corresponding plurality of parallel vertical lines. Through a simple trigonometric conversion, elevation data for successively addressed points on each observation line in the plan view terrain map stored in the scene memory are translated into pixel locations on the vertical lines on the display. In addition, this elevation data is used to determine the slope of the terrain being imaged.
On the basis of this slope data and stored artificial sun angle information that is accessed in accordance with the heading of the aircraft, various levels of shading for the terrain contour are generated and supplied as brightness value data to the pixels of the display, the addresses of which are determined by elevation data of the successively scanned lines of the observer/pilot's field of view. Because the data to be supplied to the display pixels is extracted on a line-by-line basis (field-of-view fan lines to vertical lines on the pixel display) only a simple line buffer ping-pong scheme is required for carrying out the presentation of pixel data to the display. As normal CRT screen beam scanning is horizontal across the target faceplate, a simple 90° rotation of the display accomplishes alignment of the CRT sweep lines with the data as written into and read out of the line buffers.
Rather than rotate the CRT 90°, a conventional ping-pong field screen memory may be employed in place of the line buffer to store the pixel data in its original parallel vertical lines format, which data is subsequently read out at right angles (horizontally) so as to be aligned with the horizontal scan lines of the cockpit CRT display.
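The column-write / row-read reorganization described above can be sketched in software. The following is an illustrative Python model, not the patent's hardware; class and method names are ours:

```python
# Sketch (not the patent's circuitry) of a ping-pong pair of screen
# memories: perspective data is written one vertical line (column) at a
# time into the "write" buffer, while the other buffer is scanned out
# horizontally, matching a CRT's left-to-right sweep.
class PingPongScreen:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.bufs = [[[0] * width for _ in range(height)] for _ in (0, 1)]
        self.write_sel = 0  # which buffer is currently being written

    def write_column(self, col, pixels):
        """Store one translated vertical observation line, bottom-up."""
        buf = self.bufs[self.write_sel]
        for row, value in enumerate(pixels):
            buf[self.height - 1 - row][col] = value

    def swap(self):
        """Exchange the roles of the two buffers (the 'ping-pong')."""
        self.write_sel ^= 1

    def read_scanline(self, row):
        """Read one horizontal CRT scan line from the display buffer."""
        return list(self.bufs[self.write_sel ^ 1][row])
```

Writes fill columns; after a swap, readout proceeds row by row, so no physical 90° rotation of the CRT is needed.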
As a further feature of the invention, perspective scene rotation (corresponding to roll of the aircraft) and vertical screen shift (corresponding to a change in pitch of the aircraft) may be provided by reading out a subset of the output screen memory of the ping-pong screen memory pair described above. In accordance with this embodiment of the invention, the vertical ping-pong screen memory perspective view data is read out so as to allow for real time rotation and vertical shift of the perspective view. This enables the display to present to the pilot a perspective view of the terrain in accordance with the roll and pitch attitude of the aircraft from the basic vertical screen perspective conversion scheme.
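The rotated and shifted readout can be illustrated as an address computation per output pixel. This is our sketch of the idea, not the patent's circuit; the function name and the choice of rotating about the screen centre are assumptions:

```python
import math

# Sketch of rotated/shifted read-out addressing: for each output pixel
# (x, y) of the display, compute which cell of the perspective screen
# memory to fetch, rotating by the roll angle and shifting vertically
# for pitch.  Names and conventions are illustrative only.
def read_address(x, y, roll_rad, pitch_shift, cx, cy):
    """Map display pixel (x, y) to a screen-memory address (u, v)."""
    dx, dy = x - cx, y - cy
    c, s = math.cos(roll_rad), math.sin(roll_rad)
    u = cx + c * dx - s * dy                 # rotate about screen centre (cx, cy)
    v = cy + s * dx + c * dy + pitch_shift   # then shift vertically for pitch
    return round(u), round(v)
```

With zero roll and zero pitch shift the mapping is the identity, so the basic (unrotated) perspective display falls out as a special case.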
In accordance with still another feature of the invention, where the projection (display) screen geometry is not flat (e.g. a dome as on the inside of a canopy) a read out address scheme may be mapped to complement the nonlinearities of the screen projecting geometry, so that the described scene is imaged on the screen without distortion (e.g. dome pincushion distortion). The distortion correction pattern may be stored in a ROM and addressed on a pixel-by-pixel basis. Such dome correction may be employed for either the basic embodiment of the invention providing normal perspective display, or it may be used with the perspective rotation scheme mentioned above.
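The pixel-by-pixel correction amounts to a precomputed look-up table standing in for the ROM. A minimal sketch, with an arbitrary caller-supplied warp function (the actual correction pattern would be derived from the dome geometry, which the patent does not specify):

```python
# Sketch of the distortion-correction look-up: a precomputed table
# (standing in for the ROM) maps each output pixel to the source
# address that compensates the projection geometry.  The warp function
# here is a placeholder supplied by the caller.
def build_correction_lut(width, height, warp):
    """warp(x, y) -> (u, v): the predistortion for one output pixel."""
    return {(x, y): warp(x, y) for y in range(height) for x in range(width)}

def read_corrected(screen, lut, x, y):
    """Fetch the source pixel for display position (x, y)."""
    u, v = lut[(x, y)]  # one table access per pixel, as with a ROM
    return screen[v][u]
```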
According to the present invention, there is provided, for use with a terrain map storage apparatus in which data representative of the elevation of the terrain over a prescribed geographical area is stored, a method of producing on a display screen a perspective image of the terrain to an observer, comprising the steps of establishing the geographical position of the observer on the terrain map; translating points which lie along a plurality of first lines, the first lines extending from the geographical position of the observer and traversing the map, onto locations on the display screen in accordance with the effective intersections of a plurality of second lines with a prescribed image window, the image window having an effective elevation and geographical position on the terrain map corresponding to the display screen as seen by the observer, the second lines extending from the effective elevation of the observer at the established geographical position thereof through the points on the terrain map; and producing at translated locations on the display screen respective images of the points on the terrain.
According to the present invention, there is also provided for use with a terrain map storage apparatus in which data representative of the elevation of the terrain over a prescribed geographical area is stored, a method of generating display information that is to be coupled to a display apparatus for producing, on a display screen thereof, a perspective image of the terrain to an observer.
The method comprises the steps of establishing the geographical position of the observer on the terrain map; identifying a plurality of points on the terrain map along each of a plurality of first lines, the first lines extending from the geographical position of the observer and traversing the map; translating points on respective ones of the first lines on the terrain map onto successive pixel locations on the display screen that lie along respective vertical lines which, as viewed by the observer on the display screen, coincide with the respective ones of the first lines; and, for each of the translated points, generating a respective image signal for the associated point on the terrain.
In addition, according to the present invention there is provided, for use with a terrain map storage device in which data representative of the elevation of the terrain over a prescribed geographical area is stored, an apparatus for generating display information that is to be coupled to a display device for producing, on a display screen thereof, a perspective image of the terrain to an observer. The apparatus is comprised of first means for accessing the storage device so as to obtain respective elevation data values for a plurality of points on the terrain map along each of a plurality of first lines, the first lines extending from a location on the terrain map corresponding to an established geographical position of the observer and traversing the map; and second means, coupled to the first means and to the terrain map storage device, for generating, for points on respective ones of the first lines on the terrain map, display screen pixel location signals representative of successive pixel locations on the display screen that lie along respective vertical lines which, as viewed by the observer on the display screen, coincide with the respective ones of the first lines. Also, third means are provided, coupled to the first and second means, for generating respective terrain image-representative pixel intensity signals for each of the display screen pixel location signals generated by the second means.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is an isometric graph illustrating the projection of lines of observation from an observer to a display screen plane;
Figure 1A is a plan view of a portion of the isometric graphic illustration of Figure 1;
Figure 2 is a further plan view of selected rays of observation of the graphic illustration of Figure 1;
Figure 3 is a side view of a graph illustrating the projection of lines of observation from an observer through a display screen onto a terrain profile;
Figure 4 shows a representation of stored digital terrain map data with a field of view overlay showing a fan of lines of observation from an observer's point of observation;
Figure 5 is a plan view of a pair of fields of view that may emanate from an observer's point of observation on a stored terrain data map and intersect a prescribed image screen plane;
Figure 6 is a schematic block diagram of a real time perspective display system according to the present invention;
Figure 7 is a detailed schematic block diagram of angle processor 31 of the system diagram of Figure 6;
Figure 8 is a detailed schematic block diagram of perspective read out address circuit 33 of the system diagram of Figure 6;
Figure 9 shows a pair of adjacent lines along which respective elevation data samples are accessed in accordance with the operation of the perspective readout address circuit of Figure 8;
Figure 10 shows a three dimensional layout of stored elevation data sample points and the traversal of an exemplary even line thereover for explaining the operation of the circuit of Figure 8;
Figure 11 is a detailed diagram of the components making up the elevation interpolator circuitry 53 of Figure 6;


Figure 12 shows the details of brightness calculation circuit 71 of Figure 6;
Figure 13 is a detailed schematic diagram of the pixel address calculation circuit 63 of Figure 6;
Figure 14 shows the details of the line buffer circuits 81 of the system diagram of Figure 6;
Figure 15 is a block diagram of a ping-pong screen memory arrangement;
Figure 16 shows write address circuitry for accessing the ping-pong screen memory arrangement of Figure 15 during the write mode;
Figure 17 shows the line-vs.-column arrangement of a screen memory;
Figure 18 is a generalized block diagram of read out address circuitry for accessing the ping-pong screen memory arrangement of Figure 15 during the read mode;
Figure 19 shows the relative rotation of sets of axes for a CRT display screen memory for a roll of the aircraft through an angle θR;
Figures 20 and 21 are respective graphical illustrations of the vertical and horizontal trigonometric relationships between an observer and a cockpit display screen for a change in pitch of the aircraft through an angle θP;
Figure 22 is a schematic block diagram of roll correction circuitry for reading out data from the screen memories in accordance with the roll attitude of the aircraft;
Figure 23 is a schematic block diagram of pitch correction circuitry for controlling the accessing of data from the screen memories in accordance with the pitch of the aircraft;
Figures 24 and 25 depict respective Tables I and II for explaining the operation of the vertical (Y) and horizontal (X) correction circuit components of the pitch correction circuitry of Figure 23; and
Figure 26 is a schematic block diagram of a screen limits comparison circuit.

DETAILED DESCRIPTION
As mentioned previously, pursuant to the present invention, respective rays or lines of observation from the pilot/observer along the terrain beneath and ahead of the aircraft translate into a plurality of vertical lines on a display screen which intersects the rays or lines of observation from the observer. This is illustrated in the three dimensional view in Figure 1 and in the plan view of Figure 1A, wherein the point of observation 10 of an observer is located some elevation or height H above a plane containing a line 11 of observation that extends from a point 5 directly beneath the observer 10 and continues ahead of the observer, intersecting a vertical plane (screen) 12 at a point S4 and continuing along line 11 beyond the screen 12. Screen 12 may be of a substantially rectangular or square shape having sides 6, 7, 8 and 9. A ray of observation from the observer 10 straight down from the observer to point 5 on the line 11 is denoted by ray line R1. Additional rays R2, R3...R11 are shown as emanating from the observer at point 10 and extending out ahead of the observer but lying in a common plane with line 11 and ray R1.
From the point of observation of the observer at 10, a line 13 intersects the screen at the center thereof at point 19, perpendicular to the plane of the screen 12. A line 14, which is parallel to the top edge 7 and the bottom edge 9 of the screen, may be considered to be coincident with the infinite horizon seen by the observer at 10. Line 13 is perpendicular to this line 14 at point 19.
As mentioned above, line 11 follows a path of observation along rays R1, R2, R3... from a point 5 directly beneath the observer and extending out ahead of the observer. In the illustration of Figure 1, the first ray from the observer that falls on that line 11 and intersects screen 12 as one proceeds out from the observer is ray R4, which intersects the screen at point S4 at the bottom edge of screen 12. It is to be observed that observation line 11 is rotated or displaced by some angle θ relative to a line 17 which is parallel to line 13, and perpendicularly intersects the screen 12 at point 18.


Additionally shown rays of observation R5, R6, R7, R8, R9 and R10 are coincident with the plane containing ray R4 and line 11, and each of these rays intersects screen 12 along a vertical line 15 at points S5, S6, S7, S8, S9, and S10. Point S10 denotes the upper edge point of vertical line 15 at edge 7 of screen 12. Within this grouping, ray R7 intersects the infinite horizon line 14 at point S7 and it lies in a plane containing infinite horizon line 14 and the straight-ahead observation line 13, mentioned previously.
It is to be noted that the rays of observation shown (R1...R11) are simply for purposes of illustration and obviously do not encompass all of the rays of observation that exist, since there are an infinite number of such rays which translate into points defining the vertical line 15 on screen 12. For rays of observation that track line 17 from the observer 10, which is perpendicular to the screen, there is created a corresponding vertical line 21 that is straight ahead of the observer, assuming that the observer is positioned so that the screen is symmetric with respect to his point of observation.
In accordance with the present invention, the above-described geometric relationship between lines of observation from an observer disposed in front of the screen and fanning out over a terrain profile ahead of the observer, and their corresponding vertical lines of observation on that screen, is used to locate or identify pixels on a display screen along the respective vertical lines into which elevation points of the terrain profile intersected by lines of observation translate.
To facilitate an understanding of this concept, attention is directed to Figures 2 and 3 which show, on an exaggerated scale, respective plan and side views of the above-mentioned point 10 of observation of the observer relative to a terrain 20 that lies ahead of him, the observer looking to the right of point 10 as viewed in each of the figures. Also shown is the disposition of display screen 12 in front of the pilot observer in an aircraft having a heading along line 17.
For rays of observation that extend ahead of the observer/pilot, such as along line 11 that is disposed at some angle θ relative to the heading angle of line 17, there exists an infinite number of rays of observation which intersect the terrain 20. A number of these rays of observation are specifically identified in Figure 3, e.g. ray R1 which intersects the terrain 20 at the point P1, and rays such as rays R2, R3 and RH.
Disposed at some distance R in front of the observer is the display screen 12, which is shown as being perpendicular to the direction of heading of the aircraft along the infinite horizon ray RH (which coincides with line 13 shown in Figure 1). An examination of the above-mentioned ray R1 through the point P1 where it intersects the terrain 20 reveals that the ray intersects the screen 12 at point S1, which, in the illustration shown in Figure 3, is at the bottom edge of the screen. Additional rays R2 and R3 are shown as extending from the point of the observer 10 and intersecting the terrain at points P2 and P3, respectively, and the screen 12 at points S2 and S3, respectively. A ray R4 is shown as passing over the entirety of the terrain profile and intersecting the upper edge 8 of the screen 12 at a top point S4. Each of the rays R1, R2, R3 and R4 lies along the observation line 11, namely at some angle θ relative to the heading of the aircraft along line 17. As a result, each of these rays is coplanar with one another and creates at its respective intersection of the screen 12 a series of points S1, S2, S3, S4 that form a vertical line in the plane of the screen, as described above with reference to Figures 1 and 1A.
As pointed out above, in Figure 3 line 20 represents the terrain ahead of the observer. If, instead of actually being able to see the terrain ahead of him, the observer is supplied with information representative of points of elevation of that terrain profile 20 and geographically related to the point of observation 10 of the observer, then it is possible to create on screen 12 a perspective image of what the observer would see if he was actually viewing the terrain. It is this data conversion and perspective generation capability of the present invention which offers the above-described simplified scheme of creating for the pilot observer a perspective image of the terrain ahead of him from a stored digital map of elevation points for that terrain.
More specifically, and again referring to Figures 2 and 3, line 20, while representing the terrain, may also be considered to represent a series of elevation points in a plane containing an observation ray or line 11 and the observer at point 10. From the observer's point of view 10 and extending along line RH, which is perpendicular to the screen 12, successive distance measurements may be determined.
In Figure 3, it is seen that for point P1 on terrain 20, which is displaced from point 10 by a distance ΔR1 along infinite horizon line RH (line RH being perpendicular to screen 12), there is a differential in elevation ΔE1 between the observer (ELOBS) at location 10 and the elevation of the terrain 20 at point P1 (ELP1). (A line from the observer passing through point P1 along ray R1 intersects the base of the screen at point S1, as noted previously.) At a further distance ΔR2 along line RH from the observer 10, the difference in elevation of the terrain (here at point P2) relative to the elevation of the observer at point 10 is ΔE2. (A ray of observation which intersects this point P2 along ray R2 intersects the screen 12 at point S2.) Similarly, at an even further distance ΔR3 along line RH from the observer 10, the


difference in elevation of the terrain relative to the observer (here at point P3) is ΔE3. (A ray R3 from the observer which intersects the point P3 intersects the screen 12 at point S3.) According to the present invention, each point of intersection (Si) on screen 12 of the respective rays of observation Ri that extend from the observer at point 10 along lines of observation, such as line 11, may be identified by a simple trigonometric relationship. Letting the value "a" denote the distance on the screen between point S1 and the point SH at which the infinite horizon line RH perpendicularly intersects the screen 12, then for point P1 there exists the relationship a/R = ΔE1/ΔR1, or a = R·ΔE1/ΔR1. For point P2, there is the relationship b/R = ΔE2/ΔR2, or b = R·ΔE2/ΔR2.
Finally, for point P3, the intersection point of ray R3 may be determined from the relationship c/R = ΔE3/ΔR3, or c = R·ΔE3/ΔR3. In other words, the separation distances "a", "b" and "c" of image points (pixel locations), that would be seen by the observer on the screen 12 if he were viewing points P1, P2 and P3 on the terrain profile directly ahead and beneath him, may be recreated from data which simply identifies the location of the observer relative to the screen and the terrain (as by way of navigational information and a map of stored elevation values). Since, as described previously, there are available systems (such as that described in the above copending patent application) which provide for digital storage of elevation terrain data, it is possible, in accordance with the present invention, to take advantage of such systems and generate a perspective image of the terrain by simply knowing the location of the observer 10 and the screen 12 relative to the geographical positions from which the terrain map data was obtained.
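The similar-triangles relationship above translates directly into code. A minimal sketch (variable names are ours) of locating, below the horizon point SH, the screen intersection of a ray through one terrain point:

```python
# Minimal sketch of the trigonometric translation derived above: a
# terrain point at horizontal distance delta_r ahead of the observer,
# whose elevation lies delta_e below the observer's, projects onto the
# screen (at distance r_screen from the eye) at a distance a below the
# horizon point SH, where a / r_screen = delta_e / delta_r.
def screen_offset(r_screen, delta_e, delta_r):
    """Vertical screen distance below the horizon for one terrain point."""
    return r_screen * delta_e / delta_r
```

Repeating this for every sample distance along an observation line yields the pixel positions "a", "b", "c", ... along that line's vertical screen line.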
Figure 4 illustrates a representation of digital terrain map data, as stored in a scene memory described in the above-mentioned copending application, that may be used for generating a plan view of the data on a cockpit display screen. It is to be recalled, however, that the image to be generated in accordance with the present invention is a perspective view of this data; Figure 4 is provided simply to illustrate how the map data stored in the scene memory may be represented in terms of position of the observer and heading of the aircraft.
As described previously, and as is explained in detail in the above-mentioned copending application, within the digital map generator there may be employed a scene memory which includes, for each memory location, data values of the height or elevation of the points that make up the map. In the scene memory map representation of Figure 4, which may correspond to a 12.8 km × 12.8 km area, there is shown the point of observation 10 of the pilot/observer and the direction or heading of the aircraft along a line 17. Line S, which is perpendicular to the heading of the aircraft, represents the direction of an artificial sun or shading source relative to the heading of the aircraft. For purposes of generating shading in the generated perspective image, the artificial sun direction is always established to be perpendicular to the heading of the aircraft.
From the observer at point 10, there extend a plurality of observation lines which fan out on either side of the line of heading of the aircraft and prescribe the field of view of the observer. These are identified in Figure 4 as lines L1, L2...L480, which prescribe a field of view of 45° from the observer 10, 22.5° counterclockwise of and 22.5° clockwise of heading 17. The 45° field of view shown in Figure 4 is simply for purposes of explaining a practical embodiment and, of course, neither this angle nor the number of observation lines described here is considered to be limitative. Moreover, for purposes of illustrating a variation in the field of view, it will be assumed that there are two available fields of view: the 45° field shown in Figure 4, and a 90° field (not shown in Figure 4).
These two fields of view are shown diagrammatically in Figure 5, wherein the point of observation of the observer O45, relative to the image screen 12 upon which the perspective display is to be presented, is separated from the screen by a distance (R = D45) and bounded by lines L1...LN (which here may equal the number 480 for purposes of providing an illustration of a practical embodiment). The field of view may be expanded to 90°, in which case the point of the observer O90 is closer to the screen (separation distance R = D90). As can be seen from Figure 5, where the field of view is increased (e.g. 45° to 90°), the lines of observation are, of course, spread out over a greater angle. Since the spacing of the translated vertical lines on the screen 12 is determined by the spacing of the pixels of which the screen is comprised, then, for a fixed number of pixels it can be seen that increasing the field of view means that the spacing between lines of observation must be increased; namely, the angle between the respective adjacent lines must be increased. The impact of this spacing will be more clearly understood from the subsequent description of the angle processing portion of the system.
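The Figure 5 geometry can be stated numerically. In the sketch below (function names and the even angular spacing are our assumptions), the eye-point distance for a screen of width w spanning a full field of view fov follows from D = (w/2)/tan(fov/2), and the per-line angular spacing grows with the field of view for a fixed line count:

```python
import math

# Sketch of the Figure-5 geometry: the eye-point distance D for a
# screen of width w and full field of view fov_deg, and the angular
# spacing between adjacent observation lines when n_lines lines span
# that field.  Numbers are illustrative.
def eye_distance(w, fov_deg):
    return (w / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

def line_spacing_deg(fov_deg, n_lines):
    return fov_deg / (n_lines - 1)
```

Doubling the field from 45° to 90° roughly halves the eye distance (D45 > D90) and doubles the angle between adjacent observation lines, exactly as described above.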
As is shown in Figure 4, along each line of observation Li there are assumed to be 240 points of elevation measurement (e.g. for line L1 there are points P1L1...P240L1). These points represent locations or distances along each line Li from the point of the observer 10 to some point Pi at which an elevation value ELPi from the digital terrain map may be derived, and from which elevation


value and distance to that point from the observer 10 and screen 12 the above-described translation illustrated in Figures 2 and 3 may be accomplished.
It is also to be observed from Figure 4 that, for adjacent odd and even lines, the data elevation points that are to be extracted are offset from one another by half the spacing between points along a selected line, so as to provide a smoothing/interlacing of the image that is to be generated on the screen from which the perspective view is presented to the observer. Namely, points P11...P1240 of line L1 (shown as "x"s) are offset by half the distance between two consecutive points relative to the spacing of points P21...P2240 of line L2 (shown as dots).
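The half-step interlace between odd and even lines can be sketched as follows. This is a hedged illustration assuming a unit step between consecutive sample points; the function name and the even/odd convention (odd lines starting at half a step) follow the description of the read out address circuit later in the text:

```python
# Hedged sketch of the odd/even sample-point interlace described above.
# Assumes a unit step between consecutive elevation sample points.
def sample_distances(line_index, n_points=240, step=1.0):
    """Distances from the observer to the elevation sample points along a line.
    Odd-numbered lines start half a step out, so their points fall midway
    between the points of the adjacent even-numbered lines."""
    first = step / 2.0 if line_index % 2 == 1 else step
    return [first + k * step for k in range(n_points)]

odd, even = sample_distances(1), sample_distances(2)
# an odd-line point lies halfway between two consecutive even-line points
assert odd[1] == (even[0] + even[1]) / 2.0
```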
Referring now to Figure 6, there is shown a schematic block diagram of the real time perspective display according to the present invention. This system includes an angle processor 31 (to be described in detail below with reference to Figure 7) which receives data relating to the heading angle of the aircraft over input link 25 and a field of view coefficient (12 or 24) over input link 26. From these two pieces of information, angle processor 31 generates signals which represent the successive lines that define the field of view (as shown in Figures 4 and 5 discussed above) as well as artificial sun angle information which depends upon the heading of the aircraft, as noted previously. Each particular


scan line Li is identified by a line count signal supplied over line 32 to a pixel address calculation circuit 63 (the details of which are described below with reference to Figure 13). Respective orthogonal increments (ΔX, ΔY, wherein Y represents north-up and X east-right) that define the slope of each respective line in the plane of the map are provided over links 34 and 35 to a perspective read out address circuit 33 (to be described in detail below with reference to Figure 8). The sun angle information (east-west and north-south values) is supplied over respective links 36 and 37 to a brightness calculation circuit 71 (Figure 12, described infra). These sun angle values are employed to provide shading depending upon the slope of the terrain and thereby produce pixel intensity signals for the respective pixels that lie along the translated vertical lines of observation on the respective display screen.
Input links 27 and 28 to the perspective read out address circuit 33 identify, in terms of the data stored in the scene memory 41 (as shown in Figure 4, for example), the starting point of each line (Xs, Ys), namely the point of observation of the observer relative to the terrain within the map data stored in the scene memory. Using these start values (Xs, Ys) as well as the increment values ΔX and ΔY, perspective readout address circuit 33 produces two sets of signals. The first set of signals is supplied over links 42 and 43 and


represents horizontal and vertical address signals (vertical identifying the north-up position data in the scene memory) for locating the elevation sample points Pi along the respective lines Li of the field of view fan within the scene memory 41 (as shown in Figure 4, described above). In response to these signals, the scene memory produces a set of four elevation values (corresponding to a single primary point, and three secondary points offset from the primary point as will be described below with reference to Figure 11) to be coupled to an elevation interpolator 53 (shown in greater detail in Figure 11, to be described below). The elevation interpolator 53 is also coupled to links 44 and 45 which provide, from the perspective read out circuit 33, fractional values for accurately locating, within the terrain map, the above-referenced primary and secondary points, in conjunction with the data values extracted from the scene memory 41.
The elevation interpolator 53 provides, over link 54, elevation values for these primary and secondary points and supplies them to pixel address calculation circuit 63. In response to the line count value on link 32 and the elevation values on link 54, pixel address calculation circuit 63 identifies the address of a pixel on the display screen 12 that lies along a vertical line translated from a respective observation ray from the observer, and which effectively

intersects a point on the terrain lying along that observation line which is to be recreated on the observer's display screen 12.
Elevation interpolator 53 also supplies respective east-west and north-south slope values on links 61 and 62 for the elevation data points extracted from the scene memory 41 in accordance with the information from the perspective read out address circuit 33, in order to provide shading or intensity level information to brightness calculation circuit 71. Brightness calculation circuit 71, as noted previously, produces a signal on output link 72 representative of the intensity of the respective pixel on the cockpit display screen identified in accordance with the address information produced on link 64 in the pixel address calculation circuit 63. Each of these address and data signals is coupled to a set of line buffers 81 (shown in greater detail in Figure 14, to be discussed below). This information, as well as a pixel clock signal provided over link 75, is employed by the line buffers 81 to respectively write a vertical line of pixel information on the observer's display screen while another line of information is being loaded into a line buffer. The line buffers are coupled over link 82 through a D/A converter 83 and then over link 84 to the scanning circuits of the display. As a conventional CRT
raster scan sweeps the electron beam horizontally across the

screen, the CRT itself may be rotated 90°, so that the respective vertical lines that are to be recreated on the perspective screen are properly oriented for presentation to the pilot/observer.
Referring now to Figure 7, the details of the angle processor 31 of Figure 6 will be described. As mentioned previously, the angle processor responds to information from the aircraft's navigation unit representative of the heading of the aircraft and a field of view coefficient input to produce signals identifying the respective lines of the field of view, namely which line of interest is being processed and the incremental (ΔX, ΔY) values by way of which that line is defined. The heading information is coupled to the angle processor over line 25 and it is supplied as one input of an adder 95. The field of view coefficient (K) is coupled over line 26 to a multiplexer 101. This coefficient may be settable by the observer pilot in the cockpit simply by flipping a switch which provides one of two digitally hardwired values. These values are shown as representative of the numbers K=24 and K=12, in order to provide proper numerical conversion values for the number of lines employed.
As shown in Figure 4, a 45° field of view may contain 480 lines. A 90° field of view will also contain 480 lines, but the spacing between adjacent lines will be increased (doubled). Namely, for a 90° field of view, the spacing

between adjacent lines is twice that of a 45° field of view (as diagrammatically shown in Figure 5, described above).
This relationship will become more clear from the explanation of the operation of the system to follow.
The angle processor also includes a line counter 94 which has a line clear input link 97, a line load input 91 and a line clock input 92. These inputs are supplied by a suitable processor or timing control circuit, the details of which are not necessary for an understanding of the operation of the present invention and will not be described here. The line counter provides a pair of outputs on links 102 and 32, link 32 providing a line count value, identifying which of the lines of the field of view is being processed, while link 102 supplies a value corresponding to twice that line count.
These links are employed for the respective 45° and 90° fields of view, one of which is selected by and provided to the pilot/observer. Multiplexer 101 also has a pair of inputs coupled to a link 93 which is a binary bit identifying whether or not the field of interest is an odd field or an even field.
The output of multiplexer 101 is coupled over link 96 to be added in adder 95 to data on line 25 representative of the heading angle, to produce a value over link 103 representative of the angle of the line of interest relative to north-up within the scene memory. Link 103 addresses a PROM 104 which provides four output signals over links 34-37.

The first pair of signals on links 34 and 35, respectively, represents the incremental values (ΔX, ΔY) in horizontal and vertical directions relative to the north-up orientation of the data in the scene memory for a respective line of interest, which permits successive stored elevation values for points along that line to be accessed from the scene memory.
The second pair of values supplied over lines 36 and 37 represents the direction of the artificial sun along line S which is perpendicular to heading 17 as shown in Figure 4. The sun direction is expressed in terms of a north-south (vertical) component and an east-west (horizontal) component relative to the north-up position of the data in the scene memory.
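What the PROM lookup delivers can be sketched as below. This is a hedged illustration only: the function `prom_lookup`, the unit step Rs, and the convention of taking the sun line 90° from the heading are assumptions for the sketch; the patent specifies only that the PROM stores sine and cosine values for the line increments and the perpendicular sun components:

```python
import math

# Hedged sketch (assumed function, not the patent's PROM contents) of the
# two value pairs described above: per-line map increments and an
# artificial sun direction perpendicular to the heading.
def prom_lookup(angle_deg, rs=1.0):
    a = math.radians(angle_deg)
    dx = rs * math.sin(a)  # east-west (horizontal) increment per step
    dy = rs * math.cos(a)  # north-south (vertical) increment per step
    s = math.radians(angle_deg + 90.0)  # sun line S assumed 90 deg from heading
    sun_ew, sun_ns = math.sin(s), math.cos(s)
    return dx, dy, sun_ew, sun_ns

dx, dy, sun_ew, sun_ns = prom_lookup(0.0)  # a due-north line
assert abs(dx) < 1e-9 and abs(dy - 1.0) < 1e-9
# the sun vector is perpendicular to the line direction
assert abs(dx * sun_ew + dy * sun_ns) < 1e-9
```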
In operation, the angle processor initially provides sun angle information from PROM 104 to brightness calculation circuit 71 based upon the heading angle coupled over link 25.
This is accomplished by clearing the line counter 94 and setting the binary state of odd input line 93 to a zero so that the output of multiplexer 101 over link 96, regardless of which input is selected, is all zeroes. Thus, the heading angle supplied over link 25, when added to the zero output of the multiplexer in adder 95, is coupled over link 103 to PROM 104. In response to the heading angle data, PROM 104 supplies sun angle vertical and horizontal component values over links 36 and 37, respectively, to the brightness calculation circuit 71. These
values are obtained from cosine and sine values stored in PROM 104 so as to provide east-west (horizontal) and north-south (vertical) values representative of the components that form the artificial sun direction line S as shown in Figure 4.
Once these values are stored in brightness calculation circuit 71, the scanning of the successive lines of a field of view frame commences. For purposes of the present description, it will be assumed that the field of view is 45° as shown in Figure 4, so that the input to multiplexer 101 over link 26 corresponds to a selection of coefficient K=24.
This means that multiplexer 101 will select input links 32 and 93. The line clear input over link 97 to line counter 94 is removed and line counter 94 is loaded by way of link 91 with the value (-240) (corresponding to the position of line L1 as shown in Figures 4 and 5). Also, the binary state of line 93 is set to a "1" so as to force the least significant bit of multiplexer 101 to represent an odd line (here the first line L1). This value is translated at the output from multiplexer 101 into a multibit word that is compatible with the multibit word of the heading angle on line 25 (for example, each being a 12 bit word) and represents the effective angle of line L1 relative to north-up, as shown in Figure 4.
PROM 104 responds to this multibit word by identifying the angle of line L1 and produces, on output links 34 and 35,

respective ΔX and ΔY values (namely, horizontal and vertical increment component values that define the slope of line L1 on the terrain map in the scene memory 41). These values may be thirteen bit words, twelve bits for identifying the magnitude of the increment and an additional sign bit. These values are supplied to the perspective read out address circuit 33 to be used for accessing elevation values from the scene memory 41.
A line count signal is also produced over link 32 to be coupled to pixel address calculation circuit 63. The line clock input to counter 94 on line 92 is then advanced to increment the line counter. The state of line 93 changes to a zero, indicating an even line (L2), and the output of multiplexer 101 over link 96, which represents the angle of line L2, is added to the heading angle on line 25, and the above operation is repeated for line L2. This process continues until all 480 lines which define the field of view have been processed. Then the line counter is again cleared and the above sequence is repeated.
While the above description was given for a field of view of 45° (K=24), it is to be observed that the same operation takes place for a field of view of 90°, except that twice the value of the line counter 94 is supplied over link 102 to multiplexer 101. Moreover, within multiplexer 101 the value on line 93 is multiplied by two to define the least significant and next least significant bits. If the line is

an odd line, the least significant bit out of multiplexer 101 is forced to a zero and the next bit is forced to a one. If the line being processed is an even line, both the least significant bit and the next least significant bit are zeros. As a result, although the spacing between the adjacent vertical lines on the screen 12 remains unchanged, as shown in Figure 5, the spacing between the lines that access elevation points in the scene memory is increased by a factor of two, because of the increase in the size (doubling) of the field of view.
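The net effect of the line counter, multiplexer and adder chain can be sketched arithmetically. This is a hedged simplification: the real circuit works with scaled integer words and bit forcing, while the sketch below only captures the relationship that the 90° field feeds twice the line count to the adder, doubling the angle between adjacent lines:

```python
# Hedged arithmetic sketch of the angle-processor line selection.
# The step size and the simple scale factor are illustrative assumptions;
# the hardware realizes this with a doubled counter output and bit forcing.
def line_angle(heading_deg, line_count, ninety_degree_field):
    step = 45.0 / 480  # angular spacing between lines at the 45-degree field
    scale = 2 if ninety_degree_field else 1
    return heading_deg + scale * line_count * step

# doubling the field of view doubles the angle between adjacent lines
a45 = line_angle(0.0, 1, False) - line_angle(0.0, 0, False)
a90 = line_angle(0.0, 1, True) - line_angle(0.0, 0, True)
assert a90 == 2 * a45
```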
Referring now to Figure 8, wherein a schematic block diagram of the perspective read out address circuit 33 is shown, the increment ΔX and ΔY values, as well as signals representative of half of these values, from the angle processor 31 are coupled as inputs on lines 34, 34' and 35, 35' to X multiplexer A 108 and Y multiplexer A 109, respectively. The ΔX/2 and ΔY/2 values on lines 34' and 35' are employed to provide for elevation interleaving among the odd and even lines as shown in Figure 4 and as will be described in greater detail below. The output of multiplexer A 108 is supplied over link 117 to one input of X multiplexer B 115. A second input to this multiplexer is the X start value (Xs) supplied over link 27, representative of the horizontal position of the pilot/observer's aircraft in the geographical map of terrain data stored in the scene memory.


The vertical component (Ys) of this position is supplied as a Y start signal over line 28 to one of the inputs of a Y multiplexer B 116, to which the output of Y multiplexer A 109 is also supplied over line 118. Selection of the inputs of multiplexers 108 and 109 is controlled by an A select signal on link 111, while the selection of the inputs of X multiplexer B 115 and Y multiplexer B 116 is controlled by a B select signal on line 110.
The outputs of multiplexers 115 and 116 are coupled over links 121 and 122, respectively, to X adder 123 and Y adder 124. Adders 123 and 124 add the contents of respective X register 131 and Y register 133 to the values supplied over links 121 and 122, respectively. As will be explained in detail below in conjunction with the description of the operation of the perspective read out address circuit 33, these circuits provide increment values for moving from point to point along an individual line Li.
The output of X adder 123 is coupled over line 125 to X register 131 and to an H secondary register 141. Similarly, the output 126 of Y adder 124 is coupled to Y register 133 and to a V secondary register 150. The contents of X register 131 are coupled over link 142 to X adder 123 and to an H primary register 144. Similarly, the contents of Y register 133 are coupled over link 143 to one input of Y adder 124 and to a V primary register 145. Registers 131 and 133 are controlled by the processor clock on link 130, while registers 141, 144, 150 and 145 are controlled by the scene memory clock on link 140. The clearing of X register 131 is accomplished by an X clear signal on line 132, while the same holds true for Y register 133 with respect to a Y clear signal on line 134.
Registers 141 and 150 supply to scene memory 41 respective horizontal secondary and vertical secondary addresses over links 154 and 156, while registers 144 and 145 supply to scene memory 41 respective horizontal primary and vertical primary addresses over lines 153 and 155. As will be explained in detail below, these primary and secondary address signals are used to access elevation data values from four adjacent locations on the stored terrain map of elevation data in the scene memory 41.
With reference now to Figures 9 and 10, the operation of the perspective read out address circuit in Figure 8 will be explained. Figure 9 shows a pair of adjacent lines, namely an even line Leven and an odd line Lodd, and respective elevation sample points therealong. The elevation sample points on line Leven are identified by dots, while those on the odd line Lodd are identified by "x"s. The starting point of each line is the point of the pilot observer (Xs, Ys). Figure 10 shows, in an isometric diagram, a portion of an even line Leven with graphical notations for facilitating an understanding of the operation of the components of the perspective read out address in Figure 8.
At the beginning of a respective line, the contents of the registers are cleared and one of the inputs of X multiplexer A 108 and Y multiplexer A 109 will be selected, depending upon whether the line is odd or even. Let it be initially assumed that the line is an even line, so that the A select line 111 causes the even input (ΔX, ΔY) of each of multiplexers 108 and 109 to be coupled to their respective outputs 117 and 118. This means that the ΔX and ΔY values from the PROM 104 in the angle processor 31 will be coupled as one of the inputs of each of X multiplexer B 115 and Y multiplexer B 116. At the start of a line, the B select line 110 causes the X start value (Xs) on line 27 and the Y start value (Ys) on line 28, identifying the position of the observer (Xs, Ys), to be coupled over lines 121 and 122 to X adder 123 and Y adder 124, respectively. These values are added to the contents of the X register 131 and the Y register 133, respectively (which at this time are "0", because of the clear signals previously applied thereto).
It is to be noted that the contents of X and Y registers 131 and 133, respectively, which define the coordinate values of prescribed points in the terrain data map, are defined in terms of those coordinate values, namely by way of a whole number and an accompanying fraction. This may be understood with reference to Figure 10, wherein the starting point (Xs, Ys) is located in a data map quadrant the lower left hand coordinates of which are (Xc, Yc). Using conventional orthogonal graph notation, the quadrants are identified by unit or whole number separations, so that the longest line that may extend in any quadrant is its diagonal, shown as Rs. PROM 104 in the angle processor produces the ΔX and ΔY values in terms of this diagonal Rs and the angle θ of the line of interest relative to true north, as explained previously. This is denoted in Figure 10 by ΔX = Rs sin θ and ΔY = Rs cos θ. As explained previously, the values which are supplied to the PROM 104 are derived at the output of adder 95 and depend upon both the heading angle and the contents of line counter 94 in Figure 7.
Returning again to the starting point (Xs, Ys), in terms of the quadrant values this coordinate position is stored in registers 131 and 133 as a whole number (Xc, Yc) and a fractional value (HsFRACT, VsFRACT), as mentioned above, namely Xs = Xc + HsFRACT and Ys = Yc + VsFRACT. The fractional values are supplied over links 44 and 45 (Figure 6) for primary and secondary points from the H and V registers 144, 145, 141 and 150 to the elevation interpolator (Figure 11).
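The whole-number/fraction decomposition of a map coordinate can be sketched directly. This is a minimal illustration using Python's `math.modf`; the function name is an assumption, but the arithmetic is exactly the Xs = Xc + HsFRACT, Ys = Yc + VsFRACT relation stated above:

```python
import math

# Minimal sketch of splitting a map coordinate into the quadrant corner
# (whole part, Xc/Yc) and the fractional offsets HsFRACT/VsFRACT.
def split_coordinate(xs, ys):
    h_fract, xc = math.modf(xs)
    v_fract, yc = math.modf(ys)
    return int(xc), int(yc), h_fract, v_fract

xc, yc, h_fract, v_fract = split_coordinate(12.25, 7.5)
assert (xc, yc) == (12, 7)
assert xc + h_fract == 12.25 and yc + v_fract == 7.5
```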
Once the starting point of a line has been loaded into the X and Y registers 131 and 133, the state of the B select line 110 is changed, so that the X multiplexer B 115 and the Y multiplexer B 116 couple the outputs of the X multiplexer A 108 and the Y multiplexer A 109, respectively, to lines 121 and 122. In the present example, it is assumed that the line of interest is an even line, as noted above, so that on line 117 the value ΔX will be coupled through multiplexer 115 to line 121. Similarly, the ΔY value will be coupled through multiplexer 109 and multiplexer 116 to line 122. In adder 123, the horizontal component of the starting point, which is stored in register 131, is added to the incremental value supplied over line 121 and again inserted into register 131 at the next process clock signal on line 130. The same holds true for the starting vertical component. This is shown in Figure 10 within the segment of cycle 1 as the first point (Xs+ΔX, Ys+ΔY). The X value (Xs+ΔX) is applied over line 142 and stored in a horizontal (H) primary register 144, while the vertical component or Y value (Ys+ΔY) is supplied from Y register 133 over line 143 to be stored in vertical (V) primary register 145. The process clock is advanced at twice the rate of the scene memory clock, so that the output of the X register 131 is loaded into the H primary register 144 and the output of the X adder 123 is loaded into the H secondary register 141 for each scene memory cycle read out clock signal on line 140.
The secondary point is defined by an additional ΔX and ΔY increment produced at the next process clock on line 130.


In Figure 10, the primary point on line Leven is identified by coordinates (X', Y') while the secondary point is identified by coordinates (X'', Y''). This point is also identified in Figure 9 within cycle 1 (a single scene memory cycle, or two process clock signals) by the values (Xs+2ΔX, Ys+2ΔY). For the next scene memory clock cycle, the next two points are generated, and so on, as shown in Figure 9 for the even line, so as to step through successive points along the even line.
This process continues until address values for a total of 240 points per line have been generated.
For an odd line, the displacement from the starting point is offset relative to that of the even line by a value of (ΔX/2, ΔY/2), by initially selecting the ΔX/2 and ΔY/2 inputs of multiplexers 108 and 109 for the first cycle and then switching back to the ΔX and ΔY values for the secondary point of the first cycle and all subsequent points.
This causes an odd line to proceed along the points that are disposed substantially half way between, or interlaced between, the points of the adjacent even lines, as shown at the "x"s along the odd line Lodd of Figure 9. Thus, for a first processing cycle, namely a first cycle for the scene memory clock which encompasses two processing clock times, at the start of the odd line, there will be produced a primary location (Xs+ΔX/2, Ys+ΔY/2) and a secondary point on the same line spaced a distance Rs therefrom of (Xs+3ΔX/2, Ys+3ΔY/2).
As a result of the foregoing spacing between points on the odd and the even lines, for any cycle N of an odd line, a pair of primary and secondary displacement values are produced as a horizontal primary displacement (Xs+(4N-3)ΔX/2), a horizontal secondary displacement (Xs+(4N-1)ΔX/2), a vertical primary displacement (Ys+(4N-3)ΔY/2) and a vertical secondary displacement (Ys+(4N-1)ΔY/2). For an even line, during any cycle N, for the horizontal increment there are a pair of primary, secondary displacements (Xs+(2N-1)ΔX) and (Xs+2NΔX), respectively, and a pair of primary, secondary vertical displacements (Ys+(2N-1)ΔY) and (Ys+2NΔY), respectively.
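The cycle-N displacement formulas above can be checked with a short sketch. For clarity this uses Xs = 0 and ΔX = 1 (illustrative assumptions); the function names are not from the patent:

```python
# Sketch verifying the per-cycle displacement formulas quoted above,
# with Xs = 0 and a unit increment for clarity (same form applies to Y).
def even_line_points(n):
    primary = 2 * n - 1        # Xs + (2N-1)*dX
    secondary = 2 * n          # Xs + 2N*dX
    return primary, secondary

def odd_line_points(n):
    primary = (4 * n - 3) / 2  # Xs + (4N-3)*dX/2
    secondary = (4 * n - 1) / 2  # Xs + (4N-1)*dX/2
    return primary, secondary

# cycle 1: even line samples at 1*dX and 2*dX; odd line at dX/2 and 3*dX/2
assert even_line_points(1) == (1, 2)
assert odd_line_points(1) == (0.5, 1.5)
```

Note that every odd-line point falls exactly midway between two consecutive even-line points, which is the interlace shown at the "x"s in Figure 9.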
As noted previously, each of the primary and secondary values, which are to be employed as address signals for addressing the scene memory 41 and supplying values for the elevation interpolator 53, includes a whole number or quadrant identifier number and a fractional value. Through the respective horizontal and vertical primary and secondary values, namely the X', X'', Y' and Y'' values shown in Figure 10, it is possible to access from the scene memory four points, two of which, (X', Y') and (X'', Y''), lie on the line of interest and the other two of which are orthogonally disposed (in terms of their X and Y values) with respect to these points. In this fashion, the point having coordinates (X', Y'), namely point A as shown in Figure 10, is identified as a primary point on the line Leven and the additional points B, C and D are identified as secondary points, secondary point D being on the line at coordinates (X'', Y''). The manner in which these values are employed for further address and brightness calculation processing will be described below with reference to the description of the elevation interpolator 53 in Figure 11 and the brightness calculation circuit 71 in Figure 12.
The elevation interpolator, a schematic block diagram of which is shown in Figure 11, performs the function of determining or interpolating the elevation of the terrain data map points along each respective line of observation within the field of view of the pilot/observer. In order to facilitate an understanding of the functional grouping and operation of the components of the elevation interpolator shown in Figure 11, the description to follow will make reference to both Figure 10, discussed briefly above, and Figure 11 itself. For purposes of explaining the construction and operation of the elevation interpolator, an exemplary line of observation Leven, a portion of which is shown in Figure 10, will be referenced. In terms of the coordinates of the terrain data map (wherein the Y axis lies in a north-south direction and the X axis lies in an east-west direction), line Leven has a positive slope. Lines 171 and 181, which control the selection of the inputs of a pair of multiplexers 172 and 182, respectively, provide binary ("1" or "0") control signals to multiplexers 172 and 182 and represent the polarity of the heading of the aircraft in terms of the signs of its components in the X, Y coordinate system. For the line Leven in Figure 10, the polarity is positive and the aircraft is assumed to be heading in a northeasterly direction, so that the heading of the aircraft has a positive polarity. If the aircraft were heading in a southwesterly direction along the line Leven, so that the observer's location (Xs, Ys) were at the upper right hand end of the line, the incremental values along the line would be negative X and negative Y values, so that lines 171 and 181 would cause the multiplexers 172 and 182 to select complementary values of the horizontal and vertical fraction values for each of the respective points being processed.
A register 195 is coupled to receive from scene memory 41, via lines 191-194, the stored elevation values for the respective quadrants containing points A, B, C and D shown in Figure 10. Point A is considered a primary point, so that the elevation data values are respectively EA for point A, EB for point B, EC for point C and ED for point D. These elevation values are loaded into register 195 in accordance with the scene memory clock on line 140. Output link 201, which represents the EA data value, is coupled to one input of a north-south multiplexer 205 and an east-west multiplexer 207. The EB data value is coupled over link 202 to the other input of north-south multiplexer 205 and to one input of east-west multiplexer 208. Output 203, representative of the EC data value, is coupled to one input of north-south multiplexer 206 and to a second input of east-west multiplexer 207. Finally, output link 204, which represents the ED data value stored in register 195, is coupled to a second input of north-south multiplexer 206 and a second input of east-west multiplexer 208.
Each of multiplexers 205 and 206 is controlled by a north-south select signal on line 185, whereas each of east-west multiplexers 207 and 208 is controlled by an east-west select signal on line 186. The output of each of north-south multiplexers 205 and 206 is coupled over respective links 211 and 212 to a north-south adder circuit 215, whereas the outputs 213 and 214 of east-west multiplexers 207 and 208, respectively, are supplied as inputs to adder 216. These multiplexers and adders operate to provide elevation slope values for the respective elevation points along an observation line, such as line Leven of Figure 10.
In order to determine the east-west slope of point A along line Leven, the EA and EB values and the EC and ED values are respectively subtracted from one another. For this purpose, the east-west select line 186 causes multiplexer 207 to couple input link 201 (representative of the EA data value) to its output 213 and multiplexer 208 to couple its input line 202 (representative of the EB data value) to its output line 214. These values are subtracted in circuit 216 to provide the quantity (EB-EA) at the output line 62, representative of the east-west elevation slope. For the north-south elevation slope, select line 185 causes north-south multiplexer 205 to couple link 201 to output 211 and multiplexer 206 to couple link 203 to output 212, so that the EA and EC values are subtracted from one another as a first elevation slope value relative to point A, which is supplied over link 61 as the north-south slope at point A.
A similar operation takes place with respect to the secondary point D on line Leven whereby, for the east-west slope, the EC and ED values are subtracted from one another and, for the north-south slope, the ED and EB values are subtracted from one another. The selection of these points will depend upon the direction of the heading of the aircraft along line Leven. For the present example, it is assumed that the aircraft is moving in a northeasterly direction, so that the east-west slope is determined by the (EB-EA) value and the north-south slope is determined by the (EC-EA) value. Thus, the north-south slope applied over line 61 will be supplied as one input to a multiplier 221 to be multiplied by the fractional value which is coupled through multiplexer 172 over


link 174. This produces the value (EC-EA) × VFRACT on output line 233. Similarly, east-west multiplier 222 multiplies the value (EB-EA) supplied by east-west adder 216 by the fractional value supplied over link 184 through multiplexer 182 (namely the HFRACT value) to produce the quantity (EB-EA) × HFRACT. This value, on line 234, is added to the value on line 233 by fractional adder circuit 223 to produce the value:

VFRACT × (EC-EA) + HFRACT × (EB-EA).

If the aircraft heading were in a southwesterly direction along line Leven, the fractional adder would produce the value:

VFRACT × (ED-EB) + HFRACT × (ED-EC).

This fractional sum is applied over line 232 to an adder circuit 235. The other input of adder circuit 235 is supplied over link 231 from a multiplexer 224. Multiplexer 224 is coupled to receive the primary (EA) and secondary (ED) elevation values on lines 226 and 227, respectively. Namely, link 226 provides the elevation data (EA) value for point A and line 227 provides the elevation data value for point D. Which of these values is selected is determined by a primary/secondary selection line 225. For point A of interest

here, the primary elevation value (EA) will be coupled over line 231 to adder 235. Adder 235 adds the elevation value to the slope values determined from the elevation data values from points A, B, C and D, discussed previously. The fractional values for the location of the points in the respective quadrants of the terrain map are used to interpolate the slope of the terrain at these points. Namely, the fractional values represent the relative displacement of the primary and secondary points from those positions in the terrain data map at which the elevation values are known and stored. Points in between are simply interpolated on the basis of coordinate offset as shown in Figure 10, explained previously. Accordingly, the output of the interpolation adder circuit 235 represents an interpolated value of the elevation at A, which is offset by H'FRACT and V'FRACT from the coordinate position (Xc+1, Yc+1) at which the elevation value of EA is known and stored in scene memory 41.
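The interpolation performed by adder 235 for the northeasterly primary-point case can be sketched as below. The function name is an assumption; the arithmetic is the stored elevation plus the fraction-weighted north-south and east-west slopes described above:

```python
# Hedged sketch of the slope-based elevation interpolation for a primary
# point with a northeasterly heading: E = EA + VFRACT*(EC-EA) + HFRACT*(EB-EA).
def interpolate_elevation(ea, eb, ec, h_fract, v_fract):
    ew_slope = eb - ea  # east-west slope at point A   (adder 216 output)
    ns_slope = ec - ea  # north-south slope at point A (adder 215 output)
    return ea + v_fract * ns_slope + h_fract * ew_slope

# flat terrain: interpolation returns the stored elevation unchanged
assert interpolate_elevation(100.0, 100.0, 100.0, 0.3, 0.7) == 100.0
# uniform east-west ramp of 10 units per quadrant, sampled halfway across
assert interpolate_elevation(100.0, 110.0, 100.0, 0.5, 0.0) == 105.0
```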
The output of interpolation adder 235 is coupled over link 243 to one input of multiplexer 242. Multiplexer 242 has a second input coupled over link 245 from a delay circuit 244, the input of which is coupled to link 231 from the multiplexer 224. Multiplexer 242 is controlled by an interpolate select line 241 and has its output coupled to output link 54 to supply an elevation signal for each point of interest. The purpose of the multiplexer 246 is to permit the

bypassing of the interpolation circuitry of the elevation interpolator, simply coupling the elevation values derived from the scene memory directly to the output link 54, without the interpolation achieved by the slope interpolation process described above.
Referring now to Figure 12, wherein the brightness calculation circuit 71 is shown, the north-south and east-west slope values, as applied over lines 61 and 62, respectively, from the elevation interpolator of Figure 11, are applied to respective inputs of multipliers 251 and 252. The east-west and north-south sun angle values from PROM 104 within the angle processor are respectively coupled over lines 36 and 37 as second inputs to the multipliers 251 and 252. Multipliers 251 and 252 are operated by the processor clock on line 130 and produce composite values for the east-west and north-south shading of a pixel on lines 253 and 254. These two values are added together in adder 255, the output of which is coupled over line 256 to a brightness PROM 257. This summed value addresses brightness PROM 257 to extract therefrom a pixel intensity value for the vertical line on the display screen corresponding to the line of observation being processed.
Since the processor clock operates at twice the frequency of the scene memory clock, two values are obtained for every memory access, one for the primary elevation point A and the other for the secondary elevation point D. These values are coupled

over link 27 to the line buffers 81 (Figure 14).
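The shading computation of Figure 12 amounts to a two-term sum of slope-times-sun-angle products. A minimal software sketch follows; the names are invented for illustration and the brightness PROM lookup is reduced to the raw addressing sum:

```python
def pixel_shade(ew_slope, ns_slope, ew_sun, ns_sun):
    """Composite shading value that addresses brightness PROM 257:
    multipliers 251/252 weight each terrain slope component by the
    matching sun-angle component, and adder 255 sums the two products."""
    return ew_slope * ew_sun + ns_slope * ns_sun

# Terrain sloping toward the sun shades brighter than terrain sloping away:
print(pixel_shade(0.3, 0.1, 0.5, 0.25))    # facing the sun
print(pixel_shade(-0.3, -0.1, 0.5, 0.25))  # facing away
```

In the hardware, this sum then indexes the PROM, which maps it onto the legal gray levels and suppresses the two illegal codes described later.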
The details of the pixel address calculation circuit 63 are shown in schematic block diagram form in Figure 13. The elevation values from multiplexer 246 over line 54 are coupled as one input to a subtraction circuit 261. The second input over line 263 is derived from the aircraft's altimeter and represents the elevation of the aircraft, namely the observer.
These values are subtracted from one another to obtain an elevation differential value ΔE on line 272, which is coupled as one input to a multiplier 282. The pixel address calculation circuit also includes a line point counter 262 which is incremented by processor clock 130 and loaded under the control of a signal on line 266. A line count value is supplied over line 32. The output of the counter is coupled over line 270 to a PROM 271. Counter 262 and registers within the multiplier 274 are cleared by a signal on line 264.
PROM 271 generates values corresponding to the incremental steps along each scan line, namely corresponding to the spacing Rs, mentioned previously, in accordance with the selected field of view. As shown in Figure 5, the spacing between elevation sample points from the data map will be greater for a field of view of 45° than for 90°, since the observer is effectively further away from the screen. For a 45° versus 90° selection, the spacings between points on the

lines for the 45° field of view are twice the separation between points for a 90° field of view. For this reason, the K value (24 for the 45° field of view, 12 for the 90°) is applied over line 26 to PROM 271. The output of PROM 271 is coupled over link 273 to X and Y registers contained within multiplier 274.
At the beginning of each line, the value of the line count is loaded into the line point counter 262. This line count is obtained from the angle processor 31 (Figure 7) and identifies which observation line Li from the observer to the screen is being processed. Once this value has been loaded into the line point counter 262, PROM 271 has all of the information it needs to produce a proper 1/ΔR value indicative of the incremental steps along a line of observation from the observer.
As will be recalled from the previous discussion of the translation or conversion scheme according to the present invention, referencing Figures 2 and 3, a pixel location along the vertical line of interest on the display screen, such as that for ray R1 (Figure 3), is proportional to the differential elevation ΔEi between the observer ELobs and the point Pi on the terrain effectively intersected by that ray, and inversely proportional to the distance ΔRi between the observer and the point on the terrain along a line perpendicular to the screen, namely the line directed at the infinite horizon. PROM 271 generates, for each respective point Pi along the observation line Li,


the proper differential separation value, namely the value ΔRi (actually its reciprocal, because of the counting of unit separation spacings Rs from the observer to the screen). This value remains the same for each elevation sampling point along a particular ray; it changes only when the scan proceeds to the next observation ray.
Via line 264, the line point counter 262 is cleared and the 1/ΔR unit value is loaded into the X register of multiplier 274. PROM 271 also outputs the value K/N, where K
depends upon the field of view chosen, as indicated by the value on line 26. At each processor clock signal on line 130, the line point counter 262 is incremented as the system advances from one elevation point Pi to the next along the line of observation Li. The output of PROM 271 is K/N for an even field and K/(N-0.5) for an odd field, where N is the count value in the line point counter 262. The output of the PROM is supplied to the X and Y inputs of multiplier 274. For an even line, its output is (K/N) x ΔR, whereas for an odd line, its output is (K/(N-0.5)) x ΔR.
Multiplier 282 multiplies this value on line 281 by the elevation differential ΔEi on line 272 so as to locate a pixel position Si on the vertical line on screen 12 that is intersected by the ray of observation Ri that passes through the point Pi on the terrain map at the elevation sample point,


as shown in Figure 3, described above.
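The per-point pixel address calculation of Figure 13 can be sketched as follows. This is a hedged software analogue of the hardware path through PROM 271, multiplier 282 and subtractor 285; the function names and the sign convention for the horizon offset are assumptions made here for illustration:

```python
def screen_pixel(delta_e, n, k, even_field, horizon):
    """Pixel position S_i for the observation ray through terrain point P_i.

    delta_e: terrain elevation minus observer elevation (subtractor 261);
    n: line point count, i.e. distance in unit steps Rs (counter 262);
    k: field-of-view constant supplied on line 26;
    horizon: counter position placing the infinite horizon on the screen
    (sign convention assumed).  Returns None for rays that fall off the
    512-pixel screen, mimicking the 0..511 limit test on register 291."""
    inv_r = k / n if even_field else k / (n - 0.5)  # PROM 271 output
    s = delta_e * inv_r + horizon                   # multiplier 282 / subtractor 285
    return s if 0 <= s <= 511 else None

# A terrain point 1000 units below the observer, 10 unit steps out:
print(screen_pixel(-1000.0, 10, 1.0, True, 300.0))
```

Points very close to the observer (small n, large 1/ΔR) drive the result far below zero, reproducing the "ray falls beneath the screen" case discussed next.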
As shown in Figure 3, at the beginning of the line scan, where the points along the line of observation on the terrain map are very close to the observer (e.g. points P'1, P'2, P'3, P'4), an observation ray (e.g. R'1, R'2, R'3, R'4) passing through these points may fall beneath the screen (depending upon the altitude of the aircraft). Since such points do not correspond to any of the display pixels on the display screen, they obviously are not to be reproduced for the pilot observer. To take this into account, the output of multiplier 282 is supplied over line 283 to a subtractor 285, which subtracts a counter position value supplied over line 284.
This differential value is supplied over line 286 to a comparator 292 and a register 291. The output of register 291 is coupled over line 64 as a line buffer address signal to the line buffer circuitry of Figure 14, to provide a pixel address corresponding to the ray of interest to the observer on the display screen 12. Line 64 is also coupled to a second input of comparator 292, the output of which is coupled over line 293 to register 291. By subtracting the output of multiplier 282 on line 283 from the counter position 284, the effect is to move the infinite horizon position on the display screen relative to the observer. The output of subtractor 285 is tested to see whether the pixel position of interest falls within the limits of the display screen, namely whether it is


less than the value 0, corresponding to the bottom of the screen, or greater than the value 511, corresponding to the 512th pixel at the top of the screen. In other words, if the ray falls off the screen it is certainly not going to be recognized by the pixel address circuitry for the line buffers. Instead, the last value that is stored in the register 291 is retained. If the output of subtractor 285 falls within the limits of the screen, namely between zero and 511, that value is loaded into register 291, as a result of a comparison between the current value of register 291 and the current output value of subtractor 285 as determined by comparator 292. The purpose of comparator 292 is to prevent the storage of a brightness value for a terrain location that cannot be seen by the observer but which is identified during the line scan.
For example, referring again to Figure 3, as the line point scan proceeds to point P7 it will access an elevation value therefor. However, from the observer's standpoint, because a previous point P6 has a higher elevation than point P7, the terrain at P7 is hidden from the observer's view. What will be seen by the observer is whatever appears along ray R7 (R6), which passes through each of points P6 and P7. However, since the ray R7 intercepts the terrain at point P6 first, the brightness value for the pixel at that location, namely point P6, will be imaged on the screen at the

corresponding intersection pixel. The pixels along the vertical line on the screen are sequentially addressed from the bottom to the top of the screen, and once a point is passed there is no return to it. Accordingly, when the elevation interpolator extracts a new elevation value that will intercept a previously observed ray, since the pixel point that is intercepted by that ray has already been processed and its value stored in the line buffer, the new value is ignored. In other words, comparator 292 causes the address register 291 to be loaded only with values which represent successively advanced points along the vertical line on the screen, which may or may not be equated with the successively advanced points along a line of observation from the observer on the terrain map.
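The hidden-surface test performed by register 291 and comparator 292 can be sketched as a monotone filter over the points of one observation line. This is a software analogue of the hardware behavior, not the hardware itself, and the names are invented here:

```python
def visible_points(pixel_addresses, brightness):
    """Keep only terrain points whose screen pixel address advances past
    every previously stored address (register 291 / comparator 292).
    A point mapping at or below an already-written pixel is terrain
    hidden behind a nearer ridge; None marks an off-screen ray."""
    kept = []
    last_addr = -1
    for addr, value in zip(pixel_addresses, brightness):
        if addr is not None and addr > last_addr:
            kept.append((addr, value))
            last_addr = addr  # register 291 retains the furthest pixel so far
    return kept

# The fourth point (address 180) lies behind the ridge already written
# at address 200, so its brightness value is discarded:
print(visible_points([50, 120, 200, 180, 260], [1, 2, 3, 4, 5]))
```

This reproduces the P6/P7 example above: the nearer, higher point claims the pixel first, and the later, hidden point is dropped.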
Referring to Figure 14, the line buffer is shown as being comprised essentially of a pair of line memories 313 and 314, which are operated in ping-pong fashion, namely one is receiving new data from a new scan line while the other is being read out and presented to the scanning circuitry of the display screen. To control the storage and readout of the ping-pong memories 313 and 314, a read counter 302 is employed. Initially, the read counter is cleared via line 301. The output of read counter 302 is coupled over link 307 to a pair of address multiplexers 306 and 311. A
second input of each of these multiplexers is derived from line buffer address link 64 from the pixel address calculation circuitry, discussed above in conjunction with Figure 13.
As pointed out previously, in accordance with the simplified version of the invention, a standard CRT display is physically installed in the cockpit instrument panel with a 90° rotation, so that 512 horizontal scan lines are effectively rotated into a vertical plane and will correspond to the vertical lines of observation by the observer. When the first line of a field is to be processed, the A/B select line 305 causes address multiplexer 306 to couple the line buffer address on link 64 to its output 312 and address multiplexer 311 to couple the read counter output on line 307 to its output 315. Data which is supplied to the data multiplexer 331 is thus written into one of the memories 313 and 314 as designated by the address contents of links 312 and 315.
Data multiplexer 331 is coupled to receive the brightness value output from the brightness calculation circuit 71 on link 72 and a sky code on link 321. The sky code consists of
all ones, and data multiplexer 331 couples this input to its output on line 332 when the observation line of interest goes above the terrain. The brightness values themselves may be encoded into 16 different gray levels, two of which are considered to be illegal: the above mentioned sky code (all "1"s) and an all "0"s value. These values are inhibited

from being generated by PROM 257 of the brightness calculation circuit 71. Each of memories 313 and 314 is cleared before a write cycle into the line buffers and the display is begun.
At the start of the processing cycle for a line of the field of interest, such as line L1 (Figure 4), the first line of the odd field, the line buffer address is set to zero and the first brightness value is then written into address zero of memory 313. Namely, the A/B select line 305 causes multiplexer 306 to couple the zero contents of link 64 over link 312 to the address input of memory 313. The first
brightness value for that pixel position is coupled through data multiplexer 331 on line 332 and stored in memory 313. As legitimate line buffer addresses are calculated by the pixel address calculation circuit 63 and data values from the brightness calculation circuit 71 are derived, the data values corresponding to the respective points on the screen, namely the respective rays of observation, are written into memory 313. This operation continues until all 240 points along the observation line of interest have been processed.
At the end of an observation processing cycle, the sky code, namely all "1"s supplied over link 321, is coupled through multiplexer 331 and written into the last valid address calculated by the pixel address calculation circuit 63.
At the start of the next line of the field, such as


line L3 of the odd field, the data that was previously stored in memory 313 is read out, as the state of the A/B select line 305 changes, and the second line of data is written into memory 314. The data that is read out is coupled over line 333 to a comparator 334 and one input of a multiplexer register 341. The contents of the read counter 302 are applied through multiplexer 306 to memory 313 to cause the sequential readout of the contents of memory 313. The read counter is incremented from a value of zero, namely its initially cleared value, to a value of 511, so as to step through the 512 pixel positions along the line of data that is being supplied to the display. As data is read out of memory 313, it is examined by comparator 334 to determine whether or not it is all "0"s. If the data is all "0"s (namely a black level), then the previous pixel value is multiplexed into the multiplexer register 341 via line 342.
If the data is not all "0"s, the multiplexer register 341 is loaded with the new data value on line 333. This pixel data value is supplied to the display for presentation to the observer at its respective scan point on the vertical line of interest on the screen 12.
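The gap-filling performed on readout by comparator 334 and multiplexer register 341 (repeating the last legitimate pixel wherever the cleared buffer still holds the illegal all-zeros code) can be sketched as follows; names are invented for illustration:

```python
def fill_line(buffer_values, black=0):
    """Replay one line buffer to the display: whenever comparator 334
    detects the all-zeros code, the previous legitimate pixel value held
    in multiplexer register 341 is repeated in place of it."""
    out = []
    held = black
    for v in buffer_values:
        if v != black:
            held = v  # register 341 loads the new legitimate value
        out.append(held)
    return out

# Unwritten locations between sparsely written pixel addresses inherit
# the preceding pixel's brightness:
print(fill_line([0, 5, 0, 0, 7, 0]))
```

This is why only legitimate gray levels are written: the all-zeros code is reserved to mean "nothing was stored here."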
After the above process is completed, the line buffer proceeds to write in the next line of data as it reads out the previous line of data, and the roles of the memories 313 and 314 are reversed. This process continues until all the lines

of the field have been sequentially written into the line buffers, read out, and displayed on the vertical lines of the screen to the pilot/observer. Then the other (e.g. even) field is processed, so that upon completion of processing both the odd and even fields, a total of 480 lines of observation of the pilot/observer's field of view will have been displayed on the cockpit CRT display.
In the above description of the basic embodiment of the present invention, the scanning of the pixels of the cockpit CRT display is accomplished by rotating the CRT display 90°, so that simplified line buffer circuitry may be employed in conjunction with the CRT's horizontal scanning circuitry to generate the successive vertical display lines (of observation) across the face of the pilot/observer's display screen, thereby achieving a considerably simplified hardware configuration. As mentioned above, however, in place of using line buffer circuitry, which requires physical rotation of the cathode-ray tube, a conventional ping-pong field screen memory arrangement may be used. This ping-pong screen arrangement provides for storage of the entirety of the respective odd and even fields that make up a field of view, which data may then be selectively accessed to allow for real time changes in attitude of the aircraft, specifically, changes in pitch and roll of the aircraft.
Such a ping-pong screen memory arrangement is depicted in

Figure 15 as consisting of a ping memory 402, which stores data values for all of the lines of observation of an even field within the field of view of the observer, while the pong memory stores the data values for the odd field. The memories operate in alternate fashion such that during an even field writing operation, namely during the time the data points for an even field are being processed through the line buffer circuitry of Figure 14 and written into ping memory 402, previously stored values for the odd field are being read out of the pong memory 401 for display on the pilot/observer's cockpit CRT. This particular mode is schematically shown in the switch configuration of Figure 15, wherein the line buffer data on line 333, corresponding to the outputs of respective A memory 313 and B memory 314 of the line buffer circuitry shown in Figure 14, is applied through a switch 403 which is selectively coupled to the input of one of ping memory 402 and pong memory 401. Data that is read out from these memories is selectively coupled via switch 404 to link 82, to be converted into analog form by D-A converter 83 and supplied over link 84 to the cockpit display CRT. The positions of switches 403 and 404 are reversed from that shown in Figure 15
when an odd field is to be written into memory 401 and a previously written even field is to be read out of ping memory 402 for display.
The circuitry for controlling the selective addressing of

these memories for writing data from the line buffers 313 and 314 into the ping-pong memories 402 and 401 is shown in Figure 16. Figure 17 shows the matrix format of an individual one of the ping-pong memories in terms of the rows and columns of data values that are stored therein and accessed by the addressing circuitry to be described below.
As noted above, in accordance with the basic embodiment of the invention wherein the cockpit CRT is physically rotated 90° in its installed position in the cockpit instrument panel, the outputs of the line buffers 313 and 314 of the line buffer circuitry of Figure 14 are effectively directly coupled to the (rotated) horizontal scanning circuitry of the cockpit CRT
unit, as each line of data in an individual one of the line buffers corresponds to one of the horizontal (rotated to the vertical) scan lines of the CRT (which has been physically rotated 90°). When using the ping-pong screen memory arrangement of Figure 15, however, the cockpit CRT display is no longer rotated 90°, but, instead, is employed in its normal upright physical orientation, so that the normal sweep of the electron beam across the face of the CRT produces a pattern of successive horizontal line sweeps (conventionally including 480 display lines and 45 retrace line intervals). In accordance with the present modification of the basic embodiment of the invention, since the contents of each ping-pong CRT screen memory represent a complete field (odd or

even) of data, it is possible to orient (in real time) the presentation of that data on the cockpit display in accordance with any attitude of the aircraft. This is achieved by a memory readout control subsystem that responds to attitude sensor signals from the aircraft's guidance system to controllably address the contents of the ping-pong screen memories, so that the pixel data extracted therefrom is presented on the cockpit display in a manner which reflects the attitude of the aircraft.
For example, for a counter-clockwise roll of the aircraft, the scene that would be observed by the pilot would roll conversely (clockwise), and the data accessing circuitry (for readout) which selectively addresses the contents of the ping-pong screen memories does so in a manner to place the respective pixel data that is extracted from the ping-pong screen memories into locations on the CRT display so that the perspective view that is presented to the pilot/observer is in accordance with the attitude of the aircraft. Similarly, for a change in pitch of the aircraft (e.g. nose-up), the data accessing circuitry reads out the contents of the ping-pong screen memories so as to effect a vertical shift (e.g. a downward translation for a nose-up attitude) of the same data presented on the face of the cockpit CRT display.
The writing of the data from the line memories 313 and 314 of the line buffer (Figure 14) is straightforward. The


write address circuitry shown in Figure 16 includes a row address generator (counter) 410 and a column address generator (counter) 411. Each of these counters responds to clock signals supplied thereto and generates a digital output code identifying an address in the screen memory for the field of interest (either screen memory 402 or screen memory 401) over lines 425 and 426, identifying the memory location into which line buffer data supplied from link 333 and switch 403 is to be written. Row address counter 410 counts line buffer read clock signals over link 422 and successively increments the code on line 425 representative of the row within the screen memory of interest being addressed. Counter 410 is reset by a line reset signal on line 421. Similarly, column address counter 411 counts line clock signals on line 424 and supplies a code indicative of the count value therein over line 426 to identify a column address in the screen memory of interest being accessed. Counter 411 is reset by a field reset signal on line 423.
The operation of the write address circuitry will be best understood by reference to Figure 17, which is a schematic map of the contents of an individual screen memory into which the respective lines of data from the line buffer circuitry (Figure 14) are written by the write address circuitry of Figure 16, and from which a previously stored field of data is read out, in accordance with the attitude of the aircraft, by the read address circuitry to be described subsequently.
In the description to follow, reference will be made to an even field of data being written into ping screen memory 402. Of course, effectively the same operation takes place with the pong screen memory 401, except that the lines being written therein are odd lines as opposed to the even lines described here. For an even field of a frame, ping-pong line memories 313 and 314 will be successively loaded with even lines (corresponding to vertical lines of observation from the pilot/observer) of data. As shown in Figure 17, the first line of data of an even field corresponds to line 2, which is to be loaded into screen memory 402 such that it is aligned with column 0 and the respective data values therein are stored at row positions 0-239. It is to be again recalled that in accordance with the present embodiment, the cockpit CRT is not physically rotated 90°, but is installed in the instrument panel of the aircraft in its normal upright position. This means that each field of data will contain 240 horizontal lines, so that only 240 data values will be stored for an individual line. In the description of the basic embodiment, a much larger number of data values was stored in line buffers 313 and 314 (the number 512 being given as an exemplary parametric value above). When employing the screen memories 402 and 401, accordingly, the size of the memories 313 and 314 is reduced

to match the number of rows of data that are stored in the screen memories (here 240 rows per field), so as to match the horizontal line scanning capacity of the cockpit CRT. Thus, the size of the line buffers 313 and 314 will be reduced such that each contains 240 memory locations. Reducing the size of the memory locations also requires a corresponding change in the value of K described above as part of the algorithm for the addressing scheme for these memory locations. This is a simple mathematical exercise, so that the resulting field of view encompasses 240 screen memory locations instead of the number 512, given as the example above. Of course, neither of these numbers is limitative, and the present invention is applicable to any size of display screen. For example, if the display screen were an extremely high resolution CRT unit containing on the order of 2,000 horizontal line sweeps, then the size of the line buffers 313 and 314 would each have to be increased to a capacity on the order of 1,000 data locations and the value of K would, correspondingly, be adjusted to take this into account. In any event, for purposes of the present description, it will be assumed that a conventional 525 horizontal line sweep CRT display unit is employed, so that each screen memory contains a matrix of 240 x 240 memory locations.
The operation of the write address circuitry of Figure 16 proceeds such that the contents of the line buffers 313 and

314 (Figure 14) are loaded into successive columns in the screen memory, as noted above. At the beginning of a field, each of the column and row counters 410 and 411 is reset, the field reset signal on line 423 resetting the column address counter 411 to 0 and the line reset signal on line 421 resetting the row address counter 410 to 0. When the first line of data for the even field (corresponding to observation line 2) from the A memory 313 of the line buffer circuitry is coupled over line 333, the line buffer read clock which causes the readout of the contents of the 240 memory locations of line buffer 313 is also applied to row address counter 410.
This means that the respective data values for the 240 memory locations from buffer 313 are loaded into successive row locations of column 0 as row address counter 410 is successively incremented by the line buffer read clock 422 and generates successive row address signals on line 425. The column address signal on link 426 remains the same during this line of data, since it is only incremented by the line clock.
At the end of the first line, the line clock is incremented on link 424 to advance the column address counter 411 to column 1, while the line reset signal on line 421 (rolling over after the value 239) resets row address counter 410, and the contents of B memory 314, representing observation line number 4, or the second line of the even field of data, are written into the successive row locations of


the second column (column 1) of the memory locations of screen memory 402. The next line clock increments column address counter 411 to the third column (column 2) of screen memory 402, while the line reset rollover signal on line 421 resets row address counter 410 to row address value 0. As the line buffer read clock signals are applied over line 422, the third even line of data, which has been stored in A memory 313 in the line buffer (corresponding to vertical line of observation number 6, or the third line of data of an even field), is written into the third column of memory 402. The above process is repeated until, finally, the last even line (corresponding to observation line 480) is read out of buffer 314 in the line buffer circuitry and written into column 239 of memory 402.
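The column-per-line write pattern of counters 410 and 411 (the row counter stepped by the buffer read clock, the column counter stepped once per line clock) amounts to writing each buffered observation line down one column of the 240 x 240 field memory. A sketch, with the memory size reduced for the demonstration and the names invented here:

```python
def write_field(lines, size=240):
    """Load successive line-buffer contents into successive columns of a
    size x size screen memory: row address counter 410 runs down each
    column, and column address counter 411 advances once per line clock."""
    memory = [[0] * size for _ in range(size)]
    for col, line in enumerate(lines):        # line clock -> column counter 411
        for row, value in enumerate(line):    # read clock -> row counter 410
            memory[row][col] = value
    return memory

# Two 3-point "observation lines" land in columns 0 and 1:
print(write_field([[1, 2, 3], [4, 5, 6]], size=3))
```

Because each vertical observation line becomes a memory column, a normal horizontal raster readout of the memory rows later recovers the upright image without physically rotating the CRT.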
At the completion of loading of the ping memory 402, switches 403 and 404 reverse positions and the field reset signal 423 resets the column address counter 411 back to the first column (column 0) and the above process is carried out for the odd field of data. During this time, contents of the ping memory 402 are read out under the control of read address circuitry to be described below in accordance with the attitude of the aircraft. After the contents of the ping memory 402 have been read out, the positions of switches 403 and 404 are again reversed and ping memory 402 is loaded while pong memory 401 is read out.


Figure 18 shows a generalized block diagram of the read address generation circuitry to be described in detail below.
In its generalized form of Figure 18, the read address generation circuitry 430 is shown as having its outputs supplied to the row and column address links 425 and 426 for ping-pong memories 402 and 401, as described above in connection with the write address circuitry of Figure 16. During the write mode, the values of the address signals on links 425 and 426 are generated in response to the line buffer clock signals and field identification signals described above in conjunction with Figures 1-14. For reading out the contents of the screen memory, however, the address signals are generated in response to attitude signals from the aircraft's guidance system and in accordance with signals from the scanning circuitry of the display CRT. The vertical and horizontal scanning signals from the CRT scanning circuitry are supplied over links 441 and 442 to read address generator 430. Link 443 provides a signal indicative of the pitch of the aircraft, while link 444 provides a signal indicative of the roll of the aircraft, each of these links carrying a digital code representative of the angle the aircraft makes with respect to level flight. This will be described in more detail below.
Before describing in detail the components of the read address generation circuitry, the functional purpose of this


circuitry for controlling the position of the respective data on the display CRT will be explained. Basically, the circuitry operates to map the coordinates of the CRT display into the coordinates of the ping-pong field memories 402 and 401 in response to various roll and pitch maneuvers of the aircraft. Roll of the perspective image is carried out by rotation of the coordinates of the CRT display about an origin at the center of the display screen. The respective X and Y
axes of the attitude of the aircraft under the influence of a roll maneuver form an angle θR with respect to the axes of the aircraft in level flight, corresponding to the axes Xc, Yc of the CRT display, as shown in Figure 19. In Figure 19, the rotated set of axes XR, YR represents the rotation of the image to be seen by the pilot/observer for a clockwise roll of the aircraft through an angle θR relative to level flight. Namely, as mentioned above, the image always rolls in a direction opposite to that of the roll of the aircraft. To locate a position in a screen memory which corresponds to a roll position of the CRT display screen, the standard equations for rotation are employed as follows:

XR = Yc sin θR + Xc cos θR

YR = Yc cos θR - Xc sin θR.
As mentioned above, the present invention also takes into account the pitch of the aircraft. For a change in attitude creating a pitch from level flight (nose-up or nose-down),

there may be defined a pitch angle θP that will be used to modify the rotated coordinates XR and YR defined above to provide a set of address values XA and YA whose origin is located at column/row position (0, 0) in the screen memory.
The manner in which the pitch values are generated will be described subsequently.
Since the origin of the CRT field memory is at the center of the screen, it is located at column/row coordinates (120, 120).
This means that a value of +120 has to be added to the pitch-translated coordinates XA and YA for proper transformation from the CRT display screen to the field memory. Thus, considering the total translation that must be carried out to compensate for roll and pitch of the aircraft, and the fact that aircraft attitude is defined relative to the center of the display, one may derive the following expressions:
Xc → roll → XR → pitch → XA → +120 → XM
Yc → roll → YR → pitch → YA → +120 → YM.
As pointed out above, the equations for rotation or roll of the aircraft are straightforward trigonometric expressions. Similarly, translating the coordinate positions of the CRT field memory to the center of the screen simply requires the addition of values representative of the offset from (0, 0) to the center of the screen (here the values (120, 120), because the memory is a 240 by 240 matrix). The manner in which the pitch adjustment correction is achieved

7~

may be understood by reference to Figures 20 and 21 to be described below.
Figure 20 depicts, graphically, the trigonometric relationships between the point of observation of the observer at 10 and the normal perpendicular plane of the cockpit display screen 12, described previously, and the pitched plane 12P of that screen for a change in pitch of the aircraft through some pitch angle θP. Namely, plane 12 represents the image plane of the cathode-ray tube display screen during level flight conditions, described above in accordance with the basic embodiment of the invention, which plane is separated by distance R from the point 10 of observation of the observer. The value of the pitch angle θP of the aircraft for a level flight condition is 0. For a positive pitch attitude of the aircraft, namely for a nose-up condition, the value of the pitch angle θP will represent some positive acute angle relative to level flight. Under these conditions, the separation between the point of observation 10 of the observer and the pitched plane of the display screen 12P remains the same value R, but the infinite horizon line of sight changes from line 451, which intersects screen plane 12 at point 454, to line 452, which intersects the pitched plane 12P at point 455. Line 452 also intersects plane 12 at point 461, which point represents the intersection of a ray from the observer onto the image plane 12 of data values which are actually stored in

the ping-pong memories, as described previously. For any line of observation from the point of the observer at 10 to a location on the pitched viewing screen 12P due to a pitched attitude of the aircraft, along a line 453, there will be a point 462 of intersection at pitched image plane 12P and a point of intersection on the image plane 12 at point 463 corresponding to the stored values of data in the screen memories. Line of observation 453 makes an acute angle θ1 relative to infinite horizon line 452. Thus, relative to the normal infinite horizon line 451 which intersects the stored perpendicular image plane 12, line of observation 453 is pitched at some angle θN, where θN = θP + θ1. The scanning circuitry of the cockpit display CRT scans locations on the pitched image plane 12P, not the locations on the stored image plane 12. Accordingly, the objective of the trigonometric pitch conversion scheme illustrated graphically in Figures 20 and 21 is to convert points corresponding to the scanned CRT display screen 12P to memory locations in the stored screen memories, so that the stored data can be accessed and presented at the proper locations on the display screen in the aircraft cockpit.
From the aircraft's guidance system, the pitch of the aircraft θP is provided, and the scanning circuitry of the cockpit CRT supplies signals representative of the coordinates of the position of the scanning electron beam on the face of


the CRT screen, namely the X and Y coordinates on image plane 12P. Simply put, the known quantities are the pitch angle θP and the value YR1. The objective is to find the location of point 463 in the stored image plane so that the data thereat may be accessed and projected at point 462 on the pitched image plane. This is achieved as follows. Since R and YR1 are given, one may derive the equation Tan θ1 = YR1/R.
The effective intersection of the infinite horizon line 452 at point 461 on the stored image screen 12 has an effective vertical separation of YAP from the normal infinite horizon line 451. YAP may be derived as YAP = R Tan θP.
Similarly, the location of the intersection of line of observation 453 at point 463 on the stored image plane has a value YAN derived as:
YAN = R Tan θN = R Tan (θP + θ1).
Since YA1 = YAN - YAP, YA1 may be defined as:
YA1 = R Tan (θP + θ1) - R Tan θP.
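Assuming radians and floating-point arithmetic, the expression for YA1 above can be sketched as follows (the function name is illustrative only, not from the patent):

```python
import math

def pitch_y_correction(y_r1, theta_p, r):
    """Map a vertical screen coordinate YR1 on the pitched display plane
    to YA1 on the stored (level-flight) image plane:
    YA1 = R Tan(theta_p + theta_1) - R Tan(theta_p),
    with theta_1 = Arctan(YR1 / R)."""
    theta_1 = math.atan(y_r1 / r)
    return r * math.tan(theta_p + theta_1) - r * math.tan(theta_p)
```

For zero pitch the correction reduces to the identity, since R Tan(Arctan YR1/R) = YR1; for a nose-up pitch the same screen coordinate maps to a larger stored-plane offset.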
Since θ1 = Arctan YR1/R, it can be seen that the location of point 463, namely the value YA1 on the stored memory plane 12, which is to be accessed and reproduced on the pitched image plane 12P, is represented simply in terms of a trigonometric algorithm of known quantities. The circuitry through which the algorithm for the quantity YA1 in accordance with the above equation is implemented will be described below

in connection with Figures 23-27. Before describing that implementation, however, it is useful to consider the location of the horizontal component of a respective pixel location on the pitched image plane.
Referring to Figure 21, line 453 represents a line from the observer at point 10 which is perpendicular to a line lying in the planes of both the pitched image screen 12P and the stored image plane 12, and which is coplanar with some line of observation 474 that intersects pitched image plane 12P at point 471 and stored image plane 12 at point 473. The objective in this case is to locate the horizontal component XA1 of point 473 in the stored image plane 12 that will be translated into point 471, which is separated from line 453 by distance XR1 in the pitched image plane 12P. The vertical distance of both points 463 and 473 is the same, as shown in Figure 20, discussed above.
Now, similar to the case of the determination of the vertical or Y component of the stored data in the screen memory, here the aim is to locate the horizontal component XA1 that will produce a data value at point 471 in the pitched image plane 12P. From the graphical representation shown in Figure 21, the horizontal or X value of point 471 relative to point 462, which is located at the center of the display screen, may be represented in terms of an acute angle φN that line of observation 474 makes with a

line from the observer 10 to the center of the screen. Thus, the horizontal or column value of point 471 as defined by the value of XR1 may be defined as:

XR1 = R1 Tan φN.
The magnitude of R1 depends upon the pitch of the aircraft and the particular line of observation relative to the pitch of the aircraft, namely the cosine of θ1.
Accordingly, R1 = R/cos θ1.
Using the above expressions, the separation XR1 may be defined as:

XR1 = R Tan φN/cos θ1,

which expression may be rewritten as Tan φN = XR1 cos θ1/R.
Referring again to Figure 21, the location of point 473 on the stored image plane 12 may be defined in terms of angle φN and the separation distance R2 from the observer to the effective location on the stored image plane, namely XA1 = R2 Tan φN.
The value R2 may be defined in terms of the pitch of the aircraft and the line of observation of interest, namely the total value θN, or R2 = R/cos θN.
From this and the foregoing expression one obtains:
XA1 = R Tan φN/cos θN. Therefore, the location of point 473 may be written as:
XA1 = XR1 cos θ1/cos θN.
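Under the same assumptions as before (radians, illustrative function name), the expression XA1 = XR1 cos θ1/cos θN can be sketched as:

```python
import math

def pitch_x_correction(x_r1, y_r1, theta_p, r):
    """Map a horizontal screen coordinate XR1 on the pitched display
    plane to XA1 on the stored image plane, per
    XA1 = XR1 cos(theta_1) / cos(theta_n), with theta_n = theta_p + theta_1
    and theta_1 = Arctan(YR1 / R)."""
    theta_1 = math.atan(y_r1 / r)
    theta_n = theta_p + theta_1
    return x_r1 * math.cos(theta_1) / math.cos(theta_n)
```

With zero pitch, θN = θ1 and the horizontal coordinate passes through unchanged; a nonzero pitch stretches it by the ratio cos θ1/cos θN.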
As mentioned previously, the read address calculation circuitry carries out a roll → pitch → screen-centering translation sequence in order to access from the screen memory a pixel intensity signal for the proper location on the cockpit display screen in accordance with the attitude of the aircraft. In the sequence, the roll correction is carried out first, by the rotation correction circuitry shown schematically in Figure 22.
This circuitry consists essentially of a set of counters 504 and 511, trigonometric multiplier memories or PROMs 513 and 514, and a downstream set of multipliers and adders. The interconnection and operation of the rotation correction circuitry shown in Figure 22 will be best understood by describing how it generates the translated values for the XR and YR components. To begin the processing of a field (odd or even) of interest, a value representative of the roll angle of the aircraft θR, supplied over link 444, is coupled to a set of PROMs 513 and 514. PROM 513 is simply a lookup table from which the value sin θR is derived, while PROM 514 is another lookup table from which the cosine value of the roll angle, cos θR, is derived. The digital code representative of the value sin θR is supplied over line 515 as one input of each of multipliers 521 and 523. Similarly, a


code representation of the value cos θR is supplied over link 516 to multipliers 522 and 524. These code values on respective links 515 and 516 are applied to the Y inputs of multipliers 521-524 and are clocked into input registers of these multipliers in response to a field clock supplied over link 501. A field load signal supplied over line 502 loads a value supplied over link 503 (here a code indicative of the value +120) into the Yc downcounter 504. At the beginning of a line, this downcounter 504 is decremented in response to a line clock signal 506. At the same time an Xc upcounter 511 is loaded with the value -120 supplied over link 507. Then, for each pixel clock of a horizontal line supplied over link 512, the Xc upcounter 511 is incremented by one. Thus, at the beginning of the first line of a field, the Yc downcounter is set at the value +120 and the Xc upcounter is set at the value -120. The first pixel clock then loads the Xc count, which is supplied over link 522 to the X inputs of multipliers 522 and 523, into registers that are connected to these ports.
Similarly, it loads the contents of Yc downcounter 504 which is coupled over link 505 to the X inputs of multipliers 521 and 524 into the registers that are connected to these ports.
The first pixel clock increments the Xc upcounter 511 such that the value Xc = -119. The next or second pixel clock then loads multipliers 522 and 523 with the next Xc value, and this process continues until 240 pixel clocks have been generated.


At this time, the value of the Xc upcounter is +119.
At the start of the second line, the value Yc = +119 and the value Xc = -120. The above process is repeated until all 240 lines have been processed.
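The counter sequence described above can be sketched as a generator (an illustration only; in the patent these values are produced by hardware counters, not software):

```python
def screen_scan():
    """Yield (Yc, Xc) pairs in the order produced by the Yc downcounter
    and Xc upcounter of Figure 22: Yc runs from +120 down to -119 line
    by line, while Xc runs from -120 up to +119 within each line."""
    for line in range(240):
        yc = 120 - line          # downcounter, decremented per line
        for pixel in range(240):
            xc = -120 + pixel    # upcounter, incremented per pixel
            yield yc, xc
```

The first pair emitted is (+120, -120), the last pair of the first line is (+120, +119), and the final pair of the field is (-119, +119), matching the counter endpoints described in the text.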
Multipliers 521-524 operate to multiply the values that are supplied to their X and Y input ports and produce product codes over output links 531-534. Namely, for any value Xc and Yc, the output of multiplier 521 is Yc sin θR and the output of multiplier 522 is the value Xc cos θR. Adding these two output values on lines 531 and 532 in adder 541 produces the roll X correction XR = Yc sin θR + Xc cos θR. Multiplier 523 is programmed to output the 2's complement of its product, so that the output of multiplier 523 is -Xc sin θR. The output of multiplier 524 is a code corresponding to the value Yc cos θR, so that adding the products supplied by multipliers 523 and 524 on links 533 and 534 in adder 542 produces an output code on link 544 of YR = Yc cos θR - Xc sin θR.
Using the rotation coordinate code values XR and YR
derived by the rotation correction circuitry of Figure 22, pitch correction values are then produced in accordance with the pitch correction circuitry, a block diagram of which is shown in Figure 23. In addition, the pitch correction circuitry adds into the resulting codes translation values taking into account the fact that roll and pitch maneuvers

take place relative to the center of the screen, which corresponds to coordinate locations (120, 120) in the field memories 401, 402. Accordingly, the coordinate positions of respective data values in the field memories, XM, YM, may be represented by the expressions:

XM = XA + 120 = XR cos θ1/cos θN + 120, and

YM = YA + 120 = R Tan (θP + θ1) - R Tan θP + 120.
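Combining the roll rotation, the pitch corrections, and the +120 centering offset, the complete screen-to-memory address chain can be sketched as follows (a minimal sketch under the same assumptions as the earlier fragments: radians, floating-point values, and illustrative names):

```python
import math

def memory_address(xc, yc, theta_r, theta_p, r):
    """Translate a CRT screen coordinate (Xc, Yc) into a field-memory
    address (XM, YM) for roll angle theta_r, pitch angle theta_p, and
    eye-to-screen distance R, per the expressions
    XM = XR cos(theta_1)/cos(theta_n) + 120 and
    YM = R Tan(theta_p + theta_1) - R Tan(theta_p) + 120."""
    # roll rotation
    xr = yc * math.sin(theta_r) + xc * math.cos(theta_r)
    yr = yc * math.cos(theta_r) - xc * math.sin(theta_r)
    # pitch correction
    theta_1 = math.atan(yr / r)
    theta_n = theta_p + theta_1
    xm = xr * math.cos(theta_1) / math.cos(theta_n) + 120
    ym = r * math.tan(theta_n) - r * math.tan(theta_p) + 120
    return xm, ym
```

Note that for level flight (zero roll and pitch) the chain reduces to XM = Xc + 120, YM = Yc + 120, and the center of the screen always maps to memory location (120, 120) regardless of attitude.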
Referring now to Figure 23, the interconnection and operation of the components of the pitch correction circuitry for implementing the above equations will be described. At the beginning of a field (odd or even), a field pulse signal is applied over link 555 to clear a pair of registers 552 and 565. Line 555 is also applied to the control input of multiplexer 563, one input of which receives the code corresponding to the value -120 over line 507 and the other input of which is coupled over line 567 to the output of a complement circuit 566. The output of multiplexer 563 is coupled as one input to adder 564.
The input of register 552 is coupled to the output of a PROM 551, which receives the code corresponding to the roll-corrected value YR produced by the roll correction circuitry of Figure 22 over link 544. PROM 551 constitutes a lookup table which takes the arctangent of the value YR/R


mentioned above. The output of register 552 is coupled to another lookup table PROM 573, which produces an output corresponding to cos θ1, and to one input of an adder 553. A
second input of adder 553 is coupled over link 443 to the aircraft's attitude sensor circuitry, which generates a code corresponding to the pitch angle of the aircraft θP. With register 552 being initially cleared by a field pulse on line 555, the output of adder 553 is simply a code corresponding to the pitch angle θP, which code is coupled to register 554. On the first pixel clock that is supplied over line 512, this code is loaded into register 554. The contents of register 554 are coupled to a pair of lookup table circuits 561 and 575. Lookup table 561 constitutes a PROM
which produces the value R Tan θN, whereas lookup table circuit 575 constitutes a PROM which produces the value of 1/cos θN. PROM 561 produces an output code corresponding to the expression R Tan θP, and on the next pixel clock on line 512 this value is loaded into register 562. Since, as mentioned above, the field pulse on line 555 causes multiplexer 563 to couple the code value -120 over link 507 to adder 564, the output of adder 564 is the value (R Tan θP - 120). This value is loaded into register 565 at the end of the field pulse signal on line 555. The contents of register 565 are complemented by 2's complement circuit 566 and supplied as a second input to multiplexer 563 on line 567.


Namely, link 567 carries the value -R Tan θP + 120. This value is also coupled to counter 284 to set the address for the infinite horizon by the address calculation circuit to the line buffers, described above in conjunction with Figure 14.
In the description to follow, the sequencing of operations that takes place in the pitch correction circuit of Figure 23 will be explained with reference to Table 1, shown in Figure 24, which tabulates the pipeline action of the components of the pitch correction circuit of Figure 23.
As mentioned previously, the sequence of operations that takes place calls for the roll correction values XR, YR generated in the circuitry of Figure 22 to be supplied to the pitch correction circuit of Figure 23. The XR value is supplied over link 543 to a register 572, while the YR value is applied over link 544 to PROM 551. On the first pixel clock supplied over link 512, register 552 is loaded with the value θ1 as derived by PROM 551, as mentioned above.
PROM 551 is programmed to compute θ1(1) from the known input YR1. The pitch angle value θP supplied over link 443 is added in adder 553 to the output of register 552 to provide an effective output θP + θ1(1). On the next pixel clock on line 512, register 552 is loaded with a new incremental value θ1(2), which is derived by PROM 551 on the basis of a new YR value (YR2), and the output of adder 553 is loaded into register 554. Thus, the contents of register 554 at this time

are θN1 = θP + θ1(1). On the next pixel clock on line 512, register 562 is loaded with the value computed by PROM 561 (R Tan θN1). This pixel clock also loads registers 552 and 554 with their new values, as shown in Table 1 (Figure 24).
Since PROM 551 is programmed to calculate the value θ1 = Arctan YR/R, where YR is the value calculated by the rotation circuit and supplied over line 544, a new YR value is calculated for each pixel clock, such that there are 240 separate YR(i) values, resulting in 240 θ1 values. The registers employed downstream of the calculation circuits allow the outputs of the calculation circuits to become stable when clocked. Thus, register 552 permits the output of PROM 551 to be stable when register 552 is clocked. The pitch angle value θP supplied over link 443 is summed in adder 553 with the contents of register 552 (θ1) so that adder 553 produces the output θN = θP + θ1. This result will be stable by the time the next pixel clock on line 512 loads that value into register 554. PROM 561 calculates the value R Tan θN, and its output will be stable at the next pixel clock on line 512.
This value is loaded into register 562, which supplies to adder 564 the value R Tan θN; this is summed with the value (-R Tan θP + 120) derived from register 565, as mentioned above. The output of adder 564 will be stable at the next pixel clock, and this value is stored in register 571, which is also coupled to receive the pixel clock on line 512. Thus, register 571

contains the value R Tan (θP + θ1) - R Tan θP + 120.
From Table 1 it can be seen that it takes four pixel clocks from the beginning of the operation to derive the value YM on link 425 at the output of register 571.
For the derivation of the XM values, the XR codes from the roll calculation circuit on line 543 are supplied to a register 572. Register 572 is clocked by the same pixel clock on line 512 that clocks the registers of the YM calculation, to ensure that the XM and YM values are in phase with one another. PROM 573 calculates cos θ1 and supplies its output as one input to a multiplier 574. Multiplier 574 multiplies this value by the contents of register 572, XR(i), to produce the product XR(i) cos θ1(i), which value will be stable upon the occurrence of the next pixel clock. This value, in turn, is supplied as one input to a multiplier 576, which receives as its second input the output of a PROM 575, which calculates the value 1/cos θN(i). Thus, the output of multiplier 576 is the value XR(i) cos θ1(i)/cos θN(i).
This value is added in adder 577 to a code corresponding to the numerical value +120, the center-screen location, as mentioned above.
On the next clock pulse on line 512, the output of adder 577 is loaded into register 578, so that register 578 contains the value XR(i) cos θ1(i)/cos θN(i) + 120. From Table 2 (Figure 25), it can be seen that it also takes four

pixel clocks from the beginning of the operation to derive the value XM on line 426, just as it took four pixel clocks to derive the value YM on line 425. These values XM and YM are supplied to the field memories (Figure 15) for reading out the contents thereof to be supplied to the cockpit display CRT.
It should be noted, however, that the XM and YM values may fall outside of the CRT field memory, so that a limit comparison operation, similar to that carried out for the basic embodiment of the invention, must be provided.
Specifically, the XM and YM values must be checked against the limits according to the expression:

0 ≤ (XM or YM) ≤ +240.
If either the value XM or YM is outside of these limits, the brightness value for the pixel displayed on the cockpit screen will be set such that a black dot will appear. This is accomplished by the circuitry shown in Figure 26.
As shown in Figure 26, the XM and YM values are coupled over links 426 and 425 to the A inputs of respective comparators 612 and 614. The B inputs of these comparators receive a code over link 613 corresponding to the upper limit +240. The output of comparator 612 goes high if the value of XM exceeds the value +240, and the output of comparator 614 goes high if the value of YM exceeds the upper limit +240.
Otherwise, the output of each comparator on its respective link is low.


The signs of the coordinate values XM, YM are supplied over links 601, 602 through inverters 603, 604 to respective inputs of NAND gate 605, the output of which is coupled over link 606 as a third input of NOR gate 611. The output of NOR
gate 611 is coupled to the D input of a clocked flip-flop 616, the clock input of which is the pixel clock over link 512.
Since the XM and YM values are encoded in 2's complement, the sign bit on either of links 601 and 602 is a "1" for any negative number. Accordingly, if either the XM sign or the YM sign is a "1", the Q output of flip-flop 616 will be set to a "1" on the next pixel clock supplied over link 512. Similarly, if either value XM or YM is greater than +240, the output of the corresponding comparator 612 or 614 will be a "1", which will also cause flip-flop 616 to be set to "1" on the next pixel clock. The Q output of flip-flop 616 is coupled over link 617 to cause the pixel of interest to have a value corresponding to a black dot.
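The sign and limit tests of Figure 26 amount to a simple bounds check, sketched here as an illustration (the function name and the inclusive limits are assumptions; the hardware operates on 2's-complement sign bits and comparators rather than software):

```python
def pixel_blanked(xm, ym, limit=240):
    """Return True when a computed field-memory address (XM, YM) falls
    outside the valid range, in which case the displayed pixel is
    forced to a black dot."""
    # negative coordinate <-> 2's-complement sign bit set;
    # coordinate above `limit` <-> comparator output high
    return not (0 <= xm <= limit and 0 <= ym <= limit)
```

An in-range address such as (120, 120) passes through, while any negative or over-limit coordinate blanks the pixel.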
In addition to the roll and pitch correction circuitry shown in Figures 22 and 23, described above, the readout control circuitry of Figure 18 may contain a nonlinear screen correction PROM to which the XM and YM codes on links 425 and 426 are coupled for mapping the CRT display locations onto a nonlinear, e.g. dome, image plane. Namely, in the embodiments described above, the image screen of interest was assumed to be planar. For achieving image

projection on a non-planar surface (e.g. canopy dome), a look-up table of compensating or correction codes is prepared based upon the geometry of the display surface to be employed, and the XM, YM codes generated by the roll/pitch correction circuitry are used to access a pair of "dome"-corrected values (XM', YM') from the look-up table which, in turn, control the cockpit display so that the resulting projected image will not be distorted by the nonlinearities of the surface upon which it is projected.
As will be appreciated from the foregoing description, the present invention is capable of providing a perspective presentation of the terrain over which an aircraft is flying, based upon elevation data stored in the form of a digital map, through a simple trigonometric conversion scheme. By physically rotating the cockpit display unit 90°, the basic storage/buffer mechanism may be accomplished using a pair of line buffers. On the other hand, conventional ping-pong screen memories that afford the storage of a complete field of data may be used and, as such, provide for adjustment of the projected image for a number of control variables, including change in attitude (roll/pitch) and image screen nonlinearities (e.g. dome projection).
While we have shown and described several embodiments in accordance with the present invention, it is understood that the same is not limited thereto but is susceptible of numerous changes and modifications as known to a person skilled in the art, and we therefore do not wish to be limited to the details shown and described herein but intend to cover all such changes and modifications as are obvious to one of ordinary skill in the art.


Claims (81)

The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:
1. For use with a terrain map storage apparatus in which data representative of the elevation of said terrain over a prescribed geographical area is stored, a method of producing, on a display screen, a perspective image of said terrain to an observer comprising the steps of:
(a) establishing the geographical position of said observer on said terrain map;
(b) translating points which lie along a plurality of first lines, said first lines extending from said geographical position of said observer and traversing said map, onto locations on said display screen in accordance with the effective intersections of a plurality of second lines with a prescribed image window, said image window having an effective elevation and geographical position on said terrain map corresponding to the display screen as seen by said observer, said second lines extending from the effective elevation of said observer at the established geographical position thereof through said points on said terrain map; and (c) producing at translated locations on said display screen respective images of said points of said terrain.
2. A method according to claim 1, wherein step (c) comprises generating said respective images of said points in accordance with the slope of said terrain at said points.
3. A method according to claim 2, wherein step (c) comprises generating said respective images of said points in accordance with the effective degree of shading produced at said points by the slope of said terrain thereat relative to a prescribed illumination of said terrain.
4. A method according to claim 1, wherein said plurality of first lines defines a predetermined field of view about a selected direction of observation from said observer on said terrain map.
5. A method according to claim 1, wherein step (b) comprises the steps of determining the respective terrain elevations of said points on said plurality of first lines and establishing said translated locations on said display screen in accordance with a prescribed relationship between the differences between said effective elevation of said observer and said respective terrain elevations and the effective distances between said observer and each of said screen and said points.
6. A method according to claim 5, wherein data stored in said terrain map storage apparatus define the elevations for a matrix of geographical locations on said terrain map.
7. A method according to claim 6, wherein step (b) comprises interpolating the elevations of said respective points from the elevations of said matrix of geographical locations on said terrain map.
8. A method according to claim 7, wherein step (b) comprises interpolating the elevations of said terrain at a respective one of said points in accordance with the slope of said terrain thereat.
9. A method according to claim 8, wherein said slope of said terrain at said respective one of said points is defined in accordance with elevation data values of said matrix the geographical locations of which on said terrain map are adjacent to said respective one of said points.
10. A method according to claim 9, wherein said slope of said terrain at said respective one of said points is further defined in accordance with the geographical location of said respective one of said points relative to one of said adjacent geographical locations of said matrix.
11. A method according to claim 1, wherein said plurality of first lines defines a predetermined field of view about a selected direction of observation from said observer on said terrain map, and step (c) comprises generating respective images of said points in accordance with the effective degree of shading produced at said points by the slope of said terrain thereat relative to a prescribed illumination of said terrain from a predetermined direction relative to said selected direction of observation.
12. A method according to claim 11, wherein said predetermined direction is orthogonal to said selected direction of observation.
13. A method according to claim 1, wherein the data stored in said terrain map storage apparatus is updated in accordance with information representative of the change of position of said observer relative to said terrain.
14. A method according to claim 13, wherein said information representative of the change of position of said observer corresponds to the relative movement between said observer and said terrain and said data is representative of points of said terrain as contained within the field of view of said observer based upon the relative movement of said observer and said terrain.
15. A method according to claim 14, wherein said terrain map storage apparatus is adapted for use with a vehicle for travelling over said terrain and in which said observer and display screen are to be situated, and wherein said information representative of the change of position of said observer corresponds to the travel of said vehicle over said terrain.
16. A method according to claim 15, wherein said plurality of first lines defines a predetermined field of view of said observer about the direction of travel of said vehicle, and step (c) comprises generating respective images of said points in accordance with the effective degree of shading produced at said points by the slope of said terrain relative to a prescribed illumination of said terrain from a predetermined direction relative to said direction of travel.
17. A method according to claim 1, wherein step (b) comprises sequentially identifying a plurality of points on said terrain map along each of said plurality of first lines and translating selected ones of said plurality of points along a respective one of said first lines onto successive pixel locations that lie along a respective vertical line, as viewed by said observer, on said display screen.
18. A method according to claim 17, wherein the locations of points on a respective ith one of said first lines are interleaved with respect to the locations of points on a respective (i+1)th one of said first lines.
19. A method according to claim 18, wherein a selected one of the plurality of points on said terrain map along a respective one of said first lines corresponds to a point on said terrain map through which a respective second line from the position of said observer effectively intersects a pixel position of said display screen.
20. A method according to claim 19, wherein step (b) comprises interpolating the elevations of said terrain at a respective one of said points in accordance with the slope of said terrain thereat.
21. A method according to claim 20, wherein data stored in said terrain map storage apparatus define the elevations for a matrix of geographical locations on said terrain map.
22. A method according to claim 21, wherein said slope of said terrain at said respective one of said points is defined in accordance with elevation data values of said matrix the geographical locations of which on said terrain map are adjacent to said respective one of said points.
23. A method according to claim 22, wherein said slope of said terrain at said respective one of said points is further defined in accordance with the geographical location of said respective one of said points relative to one of said adjacent geographical locations of said matrix.
24. A method according to claim 19, wherein said plurality of first lines defines a predetermined field of view about a selected direction of observation from said observer on said terrain map, and step (c) comprises generating respective images of said points in accordance with the effective degree of shading produced at said points by the slope of said terrain thereat relative to a prescribed illumination of said terrain from a predetermined direction relative to said selected direction of observation.
25. A method according to claim 17, wherein step (c) comprises storing pixel intensity information for successive pixels lying along a respective vertical line of said display screen for each respective first line, points along which have been translated into vertical pixel locations on said display screen in step (b) and coupling said pixel intensity information to said successive pixels.
26. A method according to claim 25, wherein step (c) comprises storing pixel intensity information for successive pixels lying along a respective ith vertical line of said display screen corresponding to an ith one of said first lines, and coupling pixel intensity information to successive pixels along a respective (i-1)th vertical line on said display screen corresponding to an (i-1)th one of said first lines.
27. A method according to claim 1, wherein step (c) comprises producing said respective images in accordance with the attitude of said display screen relative to a predetermined attitude.
28. A method according to claim 15, wherein step (c) comprises producing said respective images in accordance with the attitude of said vehicle in which said display screen is situated relative to a predetermined attitude of said vehicle.
29. A method according to claim 28, wherein said attitude includes the roll of said vehicle.
30. A method according to claim 28, wherein said attitude includes the pitch of said vehicle.
31. A method according to claim 28, wherein said attitude includes the roll and the pitch of said vehicle.
32. A method according to claim 1, wherein step (c) includes producing said respective images in accordance with the shape of said display screen.
33. For use with a terrain map storage apparatus in which data representative of the elevation of said terrain over a prescribed geographical area is stored, a method of generating display information that is to be coupled to a display apparatus for producing, on a display screen thereof, a perspective image of said terrain to an observer comprising the steps of:
(a) establishing the geographical position of said observer on said terrain map;
(b) identifying a plurality of points on said terrain map along each of a plurality of first lines, said first lines extending from said geographical position of said observer and traversing said map;
(c) translating points on respective ones of said first lines on said terrain map onto successive pixel locations on said display screen that lie along respective vertical lines which, as viewed by said observer on said display screen, coincide with said respective ones of said first lines; and (d) for each of said translated points, generating a respective image signal for the associated point on said terrain.
34. A method according to claim 33, wherein step (d) comprises generating a respective image signal in accordance with the slope of the terrain at its corresponding translated point.
35. A method according to claim 34, wherein step (d) comprises generating a respective image signal in accordance with the effective degree of shading produced at said point by the slope of the terrain thereat relative to a prescribed illumination of said terrain.
36. A method according to claim 34, wherein said pixel locations correspond to the effective intersections of a plurality of second lines with a prescribed image window, said window having an effective elevation and geographical position on said terrain map corresponding to the display screen as seen by said observer, said second lines extending from the effective elevation of said observer at the established geographical position thereof through said points on said terrain map.
37. A method according to claim 36, wherein step (c) comprises the steps of determining the respective terrain elevations of said points on said plurality of first lines and establishing said successive pixel locations on said display screen in accordance with a prescribed relationship between the differences between the effective elevation of said observer and said respective terrain elevations and the effective distances between said observer and each of said screen and said points.
38. A method according to claim 37, wherein step (c) comprises interpolating the elevation of said terrain at a respective one of said points in accordance with the slope of the terrain thereat.
39. A method according to claim 38, wherein data stored in said terrain map storage apparatus define the elevations for a matrix of geographical locations on said terrain map.
40. A method according to claim 39, wherein said slope of said terrain at said respective one of said points is defined in accordance with elevation data values of said matrix the geographical locations of which on said terrain map are adjacent to said respective one of said points.
41. A method according to claim 40, wherein said slope of said terrain at said respective one of said points is further defined in accordance with the geographical location of said respective one of said points relative to one of said adjacent geographical locations of said matrix.
42. A method according to claim 33, wherein said plurality of first lines defines a predetermined field of view about a selected direction of observation from said observer on said terrain map, and step (d) comprises generating respective image signals for said points in accordance with the effective degree of shading produced at said points by the slope of the terrain thereat relative to a prescribed illumination of said terrain from a predetermined direction relative to said selected direction of observation.
43. A method according to claim 42, wherein said terrain map storage apparatus is adapted for use with a vehicle for travelling over said terrain and in which said observer and said display screen are to be situated, and wherein the direction of travel of said vehicle corresponds to said selected direction of observation.
44. A method according to claim 33, wherein step (d) comprises storing respective image signals for successive pixels lying along a respective vertical line of said display screen for each respective first line, and wherein said method further includes the step of (e) coupling said produced display information to said display screen.
45. A method according to claim 44, wherein step (d) comprises storing pixel intensity information for successive pixels lying along a respective ith vertical line of said display screen corresponding to an ith one of said first lines, and step (e) comprises coupling pixel intensity information to successive pixels along a respective (i-1)th vertical line of said display screen corresponding to an (i-1)th one of said first lines.
46. A method according to claim 33, wherein the locations of points on a respective ith one of said first lines are interleaved with respect to the locations of points on a respective (i+1)th one of said first lines.
47. A method according to claim 33, wherein step (d) comprises generating a respective image signal in accordance with the attitude of said display screen relative to a predetermined attitude.
48. A method according to claim 33, wherein step (d) comprises generating a respective image signal in accordance with the shape of said display screen.
49. A method according to claim 43, wherein step (d) comprises generating respective image signals in accordance with the attitude of said vehicle in which said display screen is situated relative to a predetermined attitude of said vehicle.
50. A method according to claim 49, wherein said attitude includes the roll of said vehicle.
51. A method according to claim 49, wherein said attitude includes the pitch of said vehicle.
52. A method according to claim 49, wherein said attitude includes the roll and the pitch of said vehicle.
53. For use with a terrain map storage device in which data representative of the elevation of said terrain over a prescribed geographical area is stored, an apparatus for generating display information that is to be coupled to a display device for producing, on a display screen thereof, a perspective image of said terrain to an observer comprising:
first means for accessing said storage device so as to obtain respective elevation data values for a plurality of points on said terrain map along each of a plurality of first lines, said first lines extending from a location on said terrain map corresponding to an established geographical position of said observer and traversing said map;
second means, coupled to said first means and to said terrain map storage device, for generating, for points on respective ones of said first lines on said terrain map, display screen pixel location signals, representative of successive pixel locations on said display screen that lie along respective vertical lines which, as viewed by said observer on said display screen, coincide with said respective ones of said first lines; and third means, coupled to said first and second means, for generating respective terrain image-representative pixel intensity signals, for each of said display screen pixel location signals generated by said second means.
54. An apparatus according to claim 53, wherein said second means includes means for generating signals representative of the slope of said points on said terrain map, and said third means includes means for generating a respective pixel intensity signal in accordance with the slope of the terrain at the point for which a respective display screen pixel location signal has been generated by said second means.
55. An apparatus according to claim 54, wherein said third means includes means for generating a respective pixel intensity signal in accordance with the effective degree of shading produced at said point by the slope of the terrain thereat relative to a prescribed illumination of said terrain.
56. An apparatus according to claim 53, wherein said pixel locations correspond to the effective intersections of a plurality of second lines with a prescribed image window, said window having an effective elevation and geographical position on said terrain map corresponding to the display screen as seen by said observer, said second lines extending from the effective elevation of said observer at the established geographical position thereof through said points on said terrain map.
57. An apparatus according to claim 56, wherein said second means comprises means for determining the respective terrain elevations of said points on said plurality of first lines and generating said display screen pixel location signals in accordance with a prescribed relationship between the differences between the effective elevation of said observer and said respective terrain elevations and the effective distances between said observer and each of said screen and said points.
58. An apparatus according to claim 57, wherein said second means includes means for interpolating the elevation of said terrain at a respective one of said points in accordance with the slope of the terrain thereat.
59. An apparatus according to claim 53, wherein said plurality of first lines defines a predetermined field of view about a selected direction of observation from said observer on said terrain map, and said third means includes means for generating respective pixel intensity signals for said points in accordance with the effective degree of shading produced at said points by the slope of the terrain thereat relative to a prescribed illumination of said terrain from a predetermined direction relative to said selected direction of observation.
60. An apparatus according to claim 59, wherein said terrain map storage device is adapted for use with a vehicle for travelling over said terrain and in which said observer and said display screen are to be situated, and wherein the direction of travel of said vehicle corresponds to said selected direction of observation.
61. An apparatus according to claim 53, wherein said third means includes means for storing respective pixel intensity signals for successive pixels lying along a respective vertical line of said display screen for each respective first line, and wherein said apparatus further includes fourth means, coupled to said third means, for coupling said pixel intensity signals to said display screen.
62. An apparatus according to claim 61, wherein said third means includes means for storing pixel intensity signals for successive pixels lying along a respective ith vertical line of said display screen corresponding to an ith one of said first lines, and said fourth means comprises means for coupling pixel intensity signals to successive pixels along a respective (i-1)th vertical line of said display screen corresponding to an (i-1)th one of said first lines.
63. An apparatus according to claim 53, wherein data stored in said terrain map storage device define the elevations for a matrix of geographical locations on said terrain map.
64. An apparatus according to claim 63, wherein said third means include means for interpolating the elevations of said respective points from the elevations of said matrix of geographical locations on said terrain map.
65. An apparatus according to claim 64, wherein said third means includes means for interpolating the elevations of said terrain at a respective one of said points in accordance with the slope of said terrain thereat.
66. An apparatus according to claim 65, wherein said slope of said terrain at said respective one of said points is defined in accordance with elevation data values of said matrix the geographical locations of which on said terrain map are adjacent to said respective one of said points.
67. An apparatus according to claim 66, wherein said slope of said terrain at said respective one of said points is further defined in accordance with the geographical location of said respective one of said points relative to one of said adjacent geographical locations of said matrix.
68. An apparatus according to claim 53, wherein said terrain map storage device includes means for updating the data stored therein in accordance with information supplied thereto representative of the change of position of said observer relative to said terrain.
69. An apparatus according to claim 68, wherein said information representative of the change of position of said observer corresponds to the relative movement between said observer and said terrain and said data is representative of points of said terrain as contained within the field of view of said observer based upon the relative movement of said observer and said terrain.
70. An apparatus according to claim 69, wherein said terrain map storage apparatus is adapted for use with a vehicle for travelling over said terrain and in which said observer and display screen are to be situated, and wherein said information representative of the change of position of said observer corresponds to the travel of said vehicle over said terrain.
71. An apparatus according to claim 70, wherein said plurality of first lines defines a predetermined field of view of said observer about the direction of travel of said vehicle, and said third means includes means for generating respective pixel intensity signals associated with said points in accordance with the effective degree of shading produced at said points by the slope of said terrain relative to a prescribed illumination of said terrain from a predetermined direction relative to said direction of travel.
72. An apparatus according to claim 53, wherein the locations of points on a respective ith one of said first lines are interleaved with respect to the locations of points on a respective (i+1)th one of said first lines.
73. An apparatus according to claim 53, wherein said third means includes means for generating said respective terrain image-representative pixel location signals in accordance with the attitude of said display screen relative to a predetermined attitude.
74. An apparatus according to claim 53, wherein said third means includes means for generating said respective terrain image-representative signals in accordance with the shape of said display screen.
75. An apparatus according to claim 60, wherein said third means comprises correction means for generating said respective terrain-image representative pixel signals in accordance with the attitude of said vehicle in which said display screen is situated relative to a predetermined attitude of said vehicle.
76. An apparatus according to claim 75, wherein said attitude includes the roll of said vehicle.
77. An apparatus according to claim 75, wherein said attitude includes the pitch of said vehicle.
78. An apparatus according to claim 75, wherein said attitude includes the roll and pitch of said vehicle.
79. An apparatus according to claim 75, wherein said correction means includes memory means for storing respective terrain-image representative pixel signals generated in accordance with said predetermined attitude of said vehicle, and means for controllably reading out said pixel signals in accordance with said attitude of said vehicle.
80. An apparatus according to claim 79, wherein said attitude includes the roll and pitch of said vehicle.
81. An apparatus according to claim 79, wherein said controllably reading out means comprises means for accessing the contents of said memory in accordance with successive roll and pitch corrections representative of the attitude of said vehicle relative to said predetermined attitude.
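In modern terms, the method the claims recite is heightfield ray casting: radial "first lines" are cast from the observer across an elevation grid, sampled elevations are projected through an image window onto vertical pixel columns (pixel row set by the ratio of height difference to distance, per claims 37 and 57), nearer samples occlude farther ones, and each pixel is shaded by the local terrain slope relative to an illumination direction. The following is a minimal illustrative sketch of that pipeline, not the patented implementation; all function and parameter names are hypothetical, and bilinear interpolation stands in for the slope-based elevation interpolation of claims 38 and 58.

```python
import math

def render_perspective(elev, obs_x, obs_y, obs_h, heading,
                       fov=math.pi / 3, width=32, height=24,
                       screen_dist=20.0, max_range=40, light_dir=(-1.0, 0.0)):
    """Sketch of perspective terrain rendering from an elevation grid.

    elev      -- 2-D list of terrain elevations (the stored terrain map)
    obs_*     -- observer's map position and eye elevation
    heading   -- direction of observation (radians); fov spreads the
                 "first lines" about it, one line per screen column
    Returns a height x width grid of pixel intensities in [0, 1].
    """
    rows, cols = len(elev), len(elev[0])
    screen = [[0.0] * width for _ in range(height)]
    for col in range(width):
        # One "first line" per vertical screen column.
        ang = heading + fov * (col / (width - 1) - 0.5)
        dx, dy = math.cos(ang), math.sin(ang)
        horizon = height  # lowest undrawn row: near samples occlude far ones
        for d in range(1, max_range):
            px, py = obs_x + dx * d, obs_y + dy * d
            ix, iy = int(px), int(py)
            if not (0 <= ix < cols - 1 and 0 <= iy < rows - 1):
                break
            # Interpolate elevation from the four adjacent grid posts.
            fx, fy = px - ix, py - iy
            z00, z10 = elev[iy][ix], elev[iy][ix + 1]
            z01, z11 = elev[iy + 1][ix], elev[iy + 1][ix + 1]
            z = (z00 * (1 - fx) * (1 - fy) + z10 * fx * (1 - fy)
                 + z01 * (1 - fx) * fy + z11 * fx * fy)
            # Project onto the image window: pixel row from the ratio of
            # the observer/terrain height difference to the distances.
            row = int(height / 2 + (obs_h - z) * screen_dist / d)
            row = max(0, min(height, row))
            if row < horizon:
                # Shade by terrain slope against the illumination direction.
                shade = 0.5 + 0.25 * ((z10 - z00) * light_dir[0]
                                      + (z01 - z00) * light_dir[1])
                shade = max(0.0, min(1.0, shade))
                for r in range(row, horizon):
                    screen[r][col] = shade
                horizon = row
    return screen
```

Marching each ray near-to-far while tracking the lowest undrawn row gives hidden-surface removal for free, which is what makes this scheme attractive for real-time hardware of the era: each column is filled independently, matching the per-vertical-line pixel buffering of claims 25-26 and 61-62.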
CA000459583A 1983-07-25 1984-07-24 Real time perspective display employing digital map generator Expired CA1217272A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US06/517,037 US4682160A (en) 1983-07-25 1983-07-25 Real time perspective display employing digital map generator
US517,037 1983-07-25

Publications (1)

Publication Number Publication Date
CA1217272A true CA1217272A (en) 1987-01-27

Family

ID=24058124

Family Applications (1)

Application Number Title Priority Date Filing Date
CA000459583A Expired CA1217272A (en) 1983-07-25 1984-07-24 Real time perspective display employing digital map generator

Country Status (3)

Country Link
US (1) US4682160A (en)
CA (1) CA1217272A (en)
GB (1) GB2144608B (en)

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4763280A (en) * 1985-04-29 1988-08-09 Evans & Sutherland Computer Corp. Curvilinear dynamic image generation system
US4952922A (en) * 1985-07-18 1990-08-28 Hughes Aircraft Company Predictive look ahead memory management for computer image generation in simulators
FR2588405B1 (en) * 1985-10-08 1991-03-29 Thomson Csf GRAPHIC DISPLAY DEVICE
GB2181929B (en) * 1985-10-21 1989-09-20 Sony Corp Methods of and apparatus for video signal processing
GB8605713D0 (en) * 1986-03-07 1986-10-29 Gec Avionics Displays
GB8613447D0 (en) * 1986-06-03 1986-07-09 Quantel Ltd Video image processing systems
GB8620433D0 (en) * 1986-08-22 1987-01-14 Gec Avionics Displays
JPH01501178A (en) * 1986-09-11 1989-04-20 ヒューズ・エアクラフト・カンパニー Digital visual sensing simulation system for photorealistic screen formation
FR2610752B1 (en) * 1987-02-10 1989-07-21 Sagem METHOD FOR REPRESENTING THE PERSPECTIVE IMAGE OF A FIELD AND SYSTEM FOR IMPLEMENTING SAME
JP3138264B2 (en) * 1988-06-21 2001-02-26 ソニー株式会社 Image processing method and apparatus
US5091960A (en) * 1988-09-26 1992-02-25 Visual Information Technologies, Inc. High-speed image rendering method using look-ahead images
US5091867A (en) * 1989-03-20 1992-02-25 Honeywell Inc. Method and apparatus for generating display figures with three degrees of freedom
US5067019A (en) * 1989-03-31 1991-11-19 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Programmable remapper for image processing
EP0397071A3 (en) * 1989-05-10 1993-06-16 Honeywell Inc. Method for perspective position mapping
US4985854A (en) * 1989-05-15 1991-01-15 Honeywell Inc. Method for rapid generation of photo-realistic imagery
DE3916545A1 (en) * 1989-05-20 1990-11-22 Messerschmitt Boelkow Blohm MISSION TRAINING SYSTEM FOR AIRCRAFT
US5379215A (en) * 1991-02-25 1995-01-03 Douglas P. Kruhoeffer Method for creating a 3-D image of terrain and associated weather
WO1993000647A2 (en) * 1991-06-21 1993-01-07 Unitech Research, Inc. Real time three dimensional geo-referenced digital orthophotograph-based positioning, navigation, collision avoidance and decision support system
GB9302271D0 (en) * 1993-02-05 1993-03-24 Robinson Max The visual presentation of information derived for a 3d image system
WO1995010822A1 (en) * 1993-10-15 1995-04-20 Evans & Sutherland Computer Corporation Direct rendering of textured height fields
FR2714503A1 (en) * 1993-12-29 1995-06-30 Philips Laboratoire Electroniq Image processing method and device for constructing from a source image a target image with change of perspective.
US5630035A (en) * 1994-01-18 1997-05-13 Honneywell Inc. Method of sampling a terrain data base
US6005581A (en) * 1994-01-18 1999-12-21 Honeywell, Inc. Terrain elevation path manager
IL112187A0 (en) * 1994-01-18 1995-03-15 Honeywell Inc Optimized radial scan technique
US5563988A (en) * 1994-08-01 1996-10-08 Massachusetts Institute Of Technology Method and system for facilitating wireless, full-body, real-time user interaction with a digitally represented visual environment
US5857066A (en) * 1994-12-30 1999-01-05 Naturaland Trust Method and system for producing an improved hiking trail map
US5682525A (en) 1995-01-11 1997-10-28 Civix Corporation System and methods for remotely accessing a selected group of items of interest from a database
WO1996024216A1 (en) 1995-01-31 1996-08-08 Transcenic, Inc. Spatial referenced photography
US5604534A (en) * 1995-05-24 1997-02-18 Omni Solutions International, Ltd. Direct digital airborne panoramic camera system and method
US6052648A (en) * 1996-04-12 2000-04-18 Earthwatch Communications, Inc. Method and system for display of weather-related information
US5995903A (en) * 1996-11-12 1999-11-30 Smith; Eric L. Method and system for assisting navigation using rendered terrain imagery
US5977990A (en) * 1997-06-30 1999-11-02 Northrop Grumman Corporation Parallel computer for real time map synthesis
US6404431B1 (en) 1998-04-13 2002-06-11 Northrop Grumman Corporation Virtual map store/cartographic processor
GB9909163D0 (en) 1999-04-21 1999-06-16 Image Scan Holdings Plc Automatic defect detection
JP2000315132A (en) * 1999-04-30 2000-11-14 Sony Corp Device and method for information processing and medium
US7053894B2 (en) * 2001-01-09 2006-05-30 Intel Corporation Compression of surface light fields
IL143414A0 (en) 2001-05-23 2004-06-20 Rafael Armament Dev Authority A method and system for improving situational awareness of command and control units
FR2826769B1 (en) * 2001-06-29 2003-09-05 Thales Sa METHOD FOR DISPLAYING MAPPING INFORMATION ON AIRCRAFT SCREEN
JP4174559B2 (en) * 2001-10-26 2008-11-05 独立行政法人 宇宙航空研究開発機構 Advanced visibility information providing system and method using satellite image and flight obstacle recognition system and method
US6718261B2 (en) * 2002-02-21 2004-04-06 Lockheed Martin Corporation Architecture for real-time maintenance of distributed mission plans
FR2838272B1 (en) * 2002-04-09 2004-07-16 St Microelectronics Sa METHOD AND DEVICE FOR CORRECTING ROTATION OF A VIDEO DISPLAY
US7818317B1 (en) * 2003-09-09 2010-10-19 James Roskind Location-based tasks
US7345687B2 (en) * 2005-08-16 2008-03-18 International Business Machines Corporation Adaptive sampling of a static data set
IL172797A (en) * 2005-12-25 2012-09-24 Elbit Systems Ltd Real-time image scanning and processing
EP2104930A2 (en) 2006-12-12 2009-09-30 Evans & Sutherland Computer Corporation System and method for aligning rgb light in a single modulator projector
US8031193B1 (en) * 2007-01-25 2011-10-04 Rockwell Collins, Inc. Dynamic light shading in terrain rendering applications
US7940196B2 (en) * 2007-03-22 2011-05-10 Honeywell International Inc. System and method for indicating the field of view of a three dimensional display on a two dimensional display
US8358317B2 (en) 2008-05-23 2013-01-22 Evans & Sutherland Computer Corporation System and method for displaying a planar image on a curved surface
US8702248B1 (en) 2008-06-11 2014-04-22 Evans & Sutherland Computer Corporation Projection method for reducing interpixel gaps on a viewing surface
US8077378B1 (en) 2008-11-12 2011-12-13 Evans & Sutherland Computer Corporation Calibration system and method for light modulation device
ES2606568T3 (en) * 2009-10-26 2017-03-24 L-3 Communications Avionics Systems, Inc. System and procedure to visualize runways and terrain in synthetic vision systems
US9641826B1 (en) 2011-10-06 2017-05-02 Evans & Sutherland Computer Corporation System and method for displaying distant 3-D stereo on a dome surface
CN112529361B (en) * 2020-11-13 2024-03-22 许昌华杰公路勘察设计有限责任公司 Highway investigation route selection method based on smart phone and digital topography

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2921124A (en) * 1956-12-10 1960-01-12 Bell Telephone Labor Inc Method and apparatus for reducing television bandwidth
US3454822A (en) * 1968-01-15 1969-07-08 Lee Harrison Means and method for generating shadows on continuous surfaces in an image produced by an electronic image generator
GB1520452A (en) * 1974-08-07 1978-08-09 Gen Electric Electronic curved surfache simulator
US4177579A (en) * 1978-03-24 1979-12-11 The Singer Company Simulation technique for generating a visual representation of an illuminated area
GB2051525A (en) * 1979-06-15 1981-01-14 Redifon Simulation Ltd C.G.I.-Surface textures
US4442495A (en) * 1980-02-27 1984-04-10 Cadtrak Corporation Real time toroidal pan
US4616217A (en) * 1981-05-22 1986-10-07 The Marconi Company Limited Visual simulators, computer generated imagery, and display systems

Also Published As

Publication number Publication date
US4682160A (en) 1987-07-21
GB2144608A (en) 1985-03-06
GB2144608B (en) 1987-04-01
GB8418862D0 (en) 1984-08-30

Similar Documents

Publication Publication Date Title
CA1217272A (en) Real time perspective display employing digital map generator
US4489389A (en) Real time video perspective digital map display
US4660157A (en) Real time video perspective digital map display method
US4970682A (en) Digital map generator and display system
US4343037A (en) Visual display systems of the computer generated image type
JP2836684B2 (en) Radar signal display
EP0341645B1 (en) Digital mapping display apparatus
US4520506A (en) Method and system for compression and reconstruction of cultural data for use in a digital moving map display
US4205389A (en) Apparatus for generating a raster image from line segments
EP0137109A1 (en) Image generating apparatus for producing from the co-ordinates of the end points of a line, a two-dimensional image depicting the line as viewed by the observer
JPS62118387A (en) Efficient memory cell configuration for high grade video object generator
GB2102259A (en) Road map display system for automotive vehicles
US4656467A (en) TV graphic displays without quantizing errors from compact image memory
EP0465108A2 (en) 3-D Weather digital radar landmass simulation
US4899295A (en) Video signal processing
GB2194718A (en) Display methods and apparatus
GB2061074A (en) Improved visual display systems for computer generated images
EP0068852A2 (en) A method and an apparatus for treating range and direction information data
US4702698A (en) Digital radar generator
JP2781477B2 (en) Digital map generator
JP3490774B2 (en) How to generate geospecific textures
US4583191A (en) Scan converter
Albertz et al. Mapping from space—cartographic applications of satellite image data
US4421484A (en) Digital simulation apparatus
GB2258979A (en) Method and device for the synthesis of three-dimensional animated map-type images

Legal Events

Date Code Title Description
MKEX Expiry

Effective date: 20040724