The District of Columbia

Road (RoadPly)

by Office of the Chief Technology Officer on 06/12/2009

About: Geographic Information System, Planimetrics, structure, buildings
Taking place at: Washington, District of Columbia (D.C.), United States of America (USA)


Roads. The dataset contains polygons representing planimetric roads, created as part of the DC Geographic Information System (DC GIS) for the D.C. Office of the Chief Technology Officer (OCTO). These features were originally captured in 1999 and updated in 2005 and 2008.

The following planimetric layers were updated:

- Building Polygons (BldgPly)
- Bridge and Tunnel Polygons (BrgTunPly)
- Horizontal and Vertical Control Points (GeoControlPt)
- Obscured Area Polygons (ObsAreaPly)
- Railroad Lines (RailRdLn)
- Road, Parking, and Driveway Polygons (RoadPly)
- Sidewalk Polygons (SidewalkPly)
- Wooded Areas (WoodPly)

All DC GIS data is stored and exported in Maryland State Plane coordinates NAD 83 meters.



This data is used for the planning and management of Washington, D.C. by local government agencies.

Access constraints

Contact OCTO GIS. Any data obtained outside of OCTO GIS is an unauthorized copy.

Use constraints

Acknowledgment of the DC Geographic Information Systems Program (DC GIS). The District of Columbia Government makes no claims as to the completeness, accuracy or content of any data contained hereon, and makes no representation of any kind, including, but not limited to, the warranty of the accuracy or fitness for a particular use, nor are any such warranties to be implied or inferred with respect to the information or data furnished herein.

Point of contact

GIS Data Coordinator
in D.C. Office of the Chief Technology Officer

441 4th St NW, Suite 930 South
Washington D.C., 20001, USA

fax: (202)727-5660
hours: 8:30 AM - 5:30 PM


D.C. Office of the Chief Technology Officer (OCTO)

Data accuracy

Validated by source and/or responsible agency.

Logical consistency
OCTO developed and operated a QA/QC program to check the deliverables. Generally, the program consists of various types of processes grouped into visual checks, automated procedures, edge-matching routines, specialized checks, and field verification. Through the application of these processes, the data's spatial and attribute accuracy, usability, data documentation adherence, and specialized characteristics are checked.

Staff identified issues in a shapefile with appropriate descriptions for the vendor to fix. The data was delivered in preliminary format for a thorough review and identification of issues. The vendor fixed the data and delivered final data, which OCTO checked to ensure the vendor made the fixes appropriately.

Validated by source and/or responsible agency.

Horizontal positional accuracy
For the 1999 data that was not updated in 2005, mapping was done at a 1:1000 scale. This means that the horizontal accuracy meets National Map Accuracy Standards at 1:1000, which call for 90% of well-defined points to fall within 0.85 feet of true position. This is slightly more accurate than the 2005 data.

For the 2005 data, the horizontal accuracy of the orthorectified images is mainly determined by the accuracy of the aerotriangulation and digital surface model (DSM). For each rectified image, an RMSE value was calculated for all of the standard errors of the tie/pass/control points located in that image as computed by the aerotriangulation solution. The DSM accuracy assessment was achieved by comparing the aerotriangulation-derived elevation with the elevation of the DSM. In addition, visual examination was employed to assess all tiles and their relative edge matching. All results were examined for consistency and compliance with the ASPRS Standards for Large Scale Mapping at 1:1200, which indicates that the orthos will meet 1 foot RMSE at the 95% certainty level.
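As an illustrative sketch (not part of the original QA/QC tooling), the RMSE test described above can be written as a short Python check; the residual values below are hypothetical, not from this project:

```python
import math

def rmse(residuals):
    """Root-mean-square error of a list of point residuals (feet)."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Hypothetical check-point residuals (feet) between photo-derived
# positions and surveyed positions; values are illustrative only.
residuals_x = [0.3, -0.5, 0.7, -0.2, 0.4]
residuals_y = [0.2, 0.6, -0.4, 0.3, -0.1]

# ASPRS large-scale-mapping style test: each component RMSE must
# fall under the 1-foot threshold quoted in the report.
threshold_ft = 1.0
print(rmse(residuals_x) <= threshold_ft, rmse(residuals_y) <= threshold_ft)
# → True True
```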

Vertical positional accuracy

Lineage / Sources

  • Aerial Photography of Washington, D.C.
filmstrip, This aerial photography was composed of 24 flight lines and a total of 1023 exposures. Imagery was obtained at an altitude of 1,100 meters above mean terrain (AMT). The mission was flown with two Wild RC30 cameras, serial no. 5368 with 153.743 mm lens serial number 13413 and serial no. 5324 with 153.247 mm lens serial number 13365, with ABGPS.
    Aerial Photography
    by EarthData International, Inc, 08/04/06
  • Report of Survey Washington, DC Area
    electronic mail system, TerraSurv established 30 photo identification control points to support the aerotriangulation process. Continuously Operating Reference Station (CORS) station USNO (PIDAI7403) was used as the control for this project. The horizontal datum was the North American Datum of 1983, CORS adjustment (NAD 1983 CORS). The vertical datum was the North American Vertical Datum of 1988 (NAVD 1988).
    GPS ground control
    by TerraSurv, Inc, 08/01/05
  • Aerial Photography of Washington, D.C.
filmstrip, New aerial photography was collected in the Spring of 2008 to support new orthoimagery and planimetric updates. This aerial photography was composed of approximately 52 flight lines and a total of 1552 exposures. Imagery was obtained at an altitude of 8,900 ft above mean terrain (AMT). The mission was flown with one Vexcel UltraCam X camera and collected ABGPS and IMU data for each exposure.
    aerial photography
by Sanborn Corporation, 06/12/2009
Lineage / Processes

in EarthData International, Inc

7320 Executive Way
Frederick MD, 21704, USA

phone: (301)948-8550
fax: (301)963-2064
hours: 8:30 AM - 5:00 PM Mon - Fri
This process describes data originally captured in 1999 and applies to features not updated in 2005. The CAPTUREYEAR field contains the date of origin for the feature.

Earthdata Maryland (EDMD) transferred control points, pass and tie points from the 1995 analytical aerotriangulation solution. Control was transferred from 1995 working diapositives to diapositives of the new photography. Control points were transferred optically using a Wild PUG 4-point transfer device equipped with a 60-micron drill.

EDMD acquired new aerial photography of the District of Columbia in the spring of 1999 prior to the emergence of deciduous foliage. Aerial photography was exposed at an altitude of 7,200' AMT using a Wild RC-20 or RC-30 camera system equipped with forward motion compensation and a 12" (300 mm) focal length lens cone.

The flight design developed in 1995 was duplicated. The design calls for an approximate total of 1,000 frames in 24 North-South oriented flight lines. Forward overlap between frames within each flight line was 80%. Sidelap between adjacent flight lines was 48%. Aerial photography was captured in natural color using Kodak Aerocolor negative film type 2445. Aerial photography was not exposed using airborne GPS due to the existence of an existing control network created in 1995.

EDMD produced 2 full sets of contact prints. The prints were separated into a total of 4 sets: 2 sets of even frames and 2 sets of odd frames. Three of these sets were delivered to NCPC for distribution to the District; one set was held by EDMD for reference purposes. The project manager determined whether an odd or even set was withheld for work purposes. The planimetric mapping was developed using a set of stable base color film diapositives created from the 1999 photography. EDMD produced a flight line index of the completed photography. The positions of the photographs, as recorded by the ASCOT navigation/control system, were plotted over an existing raster or vector map of the District of Columbia. EDMD delivered 3 plots of the completed index to NCPC.

Planimetric Data Capture and Edit:

The following is a step-by-step description of the collection of planimetric features from the aerial photography.

Step 1 Planimetric data is captured within each flightline proceeding in a North-South or South-North progression. The diapositives and contact prints for each of the priority production areas are assigned to a photogrammetric technician for data collection. All planimetric data is collected using Wild BC-2 first order analytical stereoplotting systems.

Step 2 The photogrammetric supervisor establishes the data collection conventions to be used for data capture. All planimetric data is initially collected in the Microstation environment. The photogrammetric supervisor establishes data collection conventions and establishes the data-layering schema, global origin and working units to be used for data collection. These preferences are programmed into Microstation to ensure continuity.

Step 3 Planimetric data is collected and saved to the designated network subdirectory. As data is collected, the technician reviews the planimetric information on the stereoplotter monitor to ensure that collection is complete and that the required features are depicted and assigned to the correct layer in the CAD design file. EDMD has developed an in-house CAD application to prepare the planimetric features for conversion to polygons once the data has been migrated into ARC/INFO. The technician collects planimetric features in a clockwise direction, which creates centroids at the completion of each line. Each of these centroids delineates the outer boundary of a polygon. In areas where a line creates a boundary for multiple features (edge of pavement-edge of parking lot, edge of parking lot-edge of building), the line segment is duplicated and assigned to all layers of the file that contain the affected features.
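The clockwise collection convention in Step 3 can be verified with a standard signed-area (shoelace) test. This is a generic geometry check, not EDMD's in-house CAD application:

```python
def signed_area(ring):
    """Shoelace signed area: negative for clockwise rings in a
    y-up coordinate system, positive for counter-clockwise."""
    area = 0.0
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return area / 2.0

def is_clockwise(ring):
    return signed_area(ring) < 0

# A unit square traced clockwise (illustrative coordinates).
square_cw = [(0, 0), (0, 1), (1, 1), (1, 0)]
print(is_clockwise(square_cw))  # → True
```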

Step 4 The photogrammetric technician completes information for inclusion in the metadata.

Step 5 As stereo models are completed on the analytical stereoplotter, the cartographic editor copies a number of files pertaining to a block of map coverage and merges the data sets into map-sheet-oriented format. The merged data is inspected for compliance with the database design.
Criteria for inspection include correct layer assignment, line color, and line style. The CAD application has incorporated quality control functions, which add temporary symbols to indicate the
necessary duplication of line segments to complete polygon closure in all affected features. The editor makes any corrections or additions interactively. If necessary, lines are snapped to ensure closure.

Step 5A The cartographic editor translates the ARC/INFO coverages of the existing planimetric data that was produced as part of Task 2 and the State Department modification into a CAD readable format. The line work between the new mapping and existing mapping is tied together to ensure compliance with the contract requirements for topology.

Step 6 As coverage areas are completed, the editor informs the ARC/INFO supervisor that data is ready for conversion and final quality control. The completed vector files are copied to a designated network subdirectory. To avoid the possibility that incomplete or unedited data sets are mistakenly imported during production, a separate network subdirectory is used for the data at each stage in the production process. The network subdirectory structure is standardized for every project.

Step 7 Edited data sets are translated into ARC/INFO and polygon topology is created. The process used to create the final ARC/INFO coverages is described in the Attribute Accuracy Report Section.

Step 8 The cartographic technician records any pertinent dates or other information for completion of metadata.

QA/QC Plots

EDMD developed an ARC/INFO AML to generate hard copy plots of the 1995 digital orthophotos as part of Task. This routine will be modified to accommodate plotting of the planimetric and/or topographic data. The format will retain the graphic design that was developed in 1999. Design elements of the format and surround as well as the planimetric/topographic mapping will be fully compliant with the requirements stated in the contract modification. Upon completion of the initial editing and conversion of the data to ARC/INFO, EDMD will prepare a set of bond paper plots. Polygon features will be color coded for cartographic clarity and will enable NCPC quality inspectors to verify that no polygons overlap or are miscoded. A copy of the digital data that corresponds to the plots will be included to allow inspectors to review digital data content. NCPC will inspect the plots and mark any errors, omissions or mistakes on the plots. The edited plots are returned to
EDMD for correction. Once the data has been corrected, EDMD will produce a total of 3 sets of inkjet mylar plots containing the planimetric data only in the approved format surround.

ARC/INFO coverage development

Data was extracted from Librarian using the Simple option into coverages, then imported into a geodatabase. All voids and empty polygons were removed from the shapefiles. The NCPC ArcInfo Librarian dataset is made up of 350 tiles. Over 29 data layers were extracted from the Librarian using the following procedures (see planimetric process.xls).

1) In order to extract seamless data from Librarian, an extraction polygon must be created. The extraction polygon must encompass the span of all the tiles. Extracting the index layer of the Librarian and dissolving the boundaries of the 350 polygons into 1 polygon accomplished this task.

The procedures for creating the extraction polygon are as follows:
- Extract the index layer using ArcView and convert the theme into a shapefile.
- Create a field within the feature table on which the individual polygons can dissolve by assigning a value of 1 to all records.
- Using the geoprocessing extension, dissolve the shapefile's records into 1 polygon on the common field.
- Transform the polygon theme into a polygon coverage through ArcToolbox.
- Once the extraction polygon was prepped, data layer extraction was done entirely in the ArcInfo environment.
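Conceptually, the dissolve step groups all tile records by the common field before the GIS engine unions their geometry. A minimal Python sketch of the grouping (geometry union omitted; the tile names are hypothetical):

```python
from collections import defaultdict

# Hypothetical tile records: (tile_id, dissolve_value, geometry placeholder).
tiles = [("T%03d" % i, 1, object()) for i in range(350)]

def dissolve(records, key_index=1):
    """Group records by the common dissolve field; each group would be
    unioned into a single polygon by the GIS engine."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key_index]].append(rec)
    return groups

groups = dissolve(tiles)
print(len(groups))  # all 350 tiles share dissolve value 1 → 1 group
```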

Setting the ArcInfo Environment.
1) Launch the ArcInfo Command Prompt window.
2) Set the working directory ( arc: w d:\workspace\extraction)
3) Set the precision for the environment and to all processes done in the ArcInfo Session. ( arc: Precision double double )
4) Enter the librarian module (arc: Librarian)
5) Set the library volume (Librarian: library ncpc)
6) Set the extraction polygon coverage (Librarian: setcover index)
7) Set the layer to be extracted (Librarian: setlayer air)
8) Enter the command line to extract. (Librarian: extract OPTION # clip)

The extract command has a few options that can be set when extracting different types of layers such as polygons, lines, and points. Extract DISSOLVE is used for polygon layers only because it will extract the layer and merge the polygons where they are split by index tiles, thus removing the additional polygons. The ArcInfo commands involved with this option are Clip, Dissolve, and Build.

For line and point layers, the SIMPLE option was used to extract. The topology is left unbuilt when the information is extracted, because the tile boundaries segment the lines and create pseudo nodes. The user-ids from the ArcInfo table were used to unsplit the lines and remove the pseudo nodes. Once that is done, topology can be rebuilt for the coverage.

Data Improvements

The extracted line coverages contained some additional pseudo nodes along tile boundaries, probably created as a result of different digitizing techniques (neither right nor wrong). Using the Unsplit command, these areas were cleaned up by unsplitting only lines that had the same dxf-layer value and shared a node.
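The Unsplit rule described here — merge two arcs only when they share a node touched by exactly two arcs carrying the same dxf-layer value — can be sketched in Python (coordinates and layer names are illustrative, not project data):

```python
from collections import defaultdict

# Each arc: (dxf_layer, [point, point, ...]); endpoints are tuples.
arcs = [
    ("ROAD", [(0, 0), (1, 0)]),
    ("ROAD", [(1, 0), (2, 0)]),   # pseudo node at (1, 0): same layer, degree 2
    ("CURB", [(2, 0), (3, 0)]),   # different layer: node at (2, 0) is kept
]

def unsplit(arcs):
    """Merge pairs of arcs that share an endpoint used by exactly two
    arcs of the same attribute value (a pseudo node)."""
    arcs = [list(a) for a in arcs]
    merged = True
    while merged:
        merged = False
        # Record which arcs touch each node.
        degree = defaultdict(list)
        for idx, (_, pts) in enumerate(arcs):
            degree[pts[0]].append(idx)
            degree[pts[-1]].append(idx)
        for node, idxs in degree.items():
            if len(idxs) == 2 and idxs[0] != idxs[1]:
                i, j = idxs
                if arcs[i][0] != arcs[j][0]:
                    continue  # different dxf-layer: real node, keep it
                a, b = arcs[i][1], arcs[j][1]
                # Orient both arcs so the shared node sits at the join.
                if a[0] == node:
                    a.reverse()
                if b[-1] == node:
                    b.reverse()
                arcs[i][1] = a + b[1:]
                del arcs[j]
                merged = True
                break
    return [(layer, pts) for layer, pts in arcs]

result = unsplit(arcs)
print(len(result))  # the two ROAD arcs merge; the CURB arc stays → 2
```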

Data cleanup

The polygon coverages extracted contained void (coded 9999) areas. In Arcedit, these ambiguous features were removed with the following technique.
Arc: ae (enter arcedit module)

Arcedit: ec coverage (enter the editing coverage)

Arcedit: ef poly (enter the feature to be edited)

Arcedit: select for dxf-layer = 'VOID' (querying for features in the attribute table under dxf-layer that contains attribute labeled VOID)

Arcedit: delete (delete the features found)

Arcedit: Save (save the changes)

The neatlines were removed from the line coverages using the same method, querying where the dxf-layer value = 'NEATLINE'.
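The select-then-delete sequence above amounts to filtering records by an attribute value. A hedged Python sketch with hypothetical records:

```python
# Hypothetical feature records mirroring the dxf-layer attribute used above.
features = [
    {"dxf_layer": "ROAD", "id": 1},
    {"dxf_layer": "VOID", "id": 2},
    {"dxf_layer": "NEATLINE", "id": 3},
    {"dxf_layer": "ROAD", "id": 4},
]

def delete_where(records, field, value):
    """Keep only records whose attribute differs from the queried value,
    mirroring Arcedit's select-then-delete sequence."""
    return [r for r in records if r[field] != value]

cleaned = delete_where(features, "dxf_layer", "VOID")
cleaned = delete_where(cleaned, "dxf_layer", "NEATLINE")
print([r["id"] for r in cleaned])  # → [1, 4]
```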

Creating Shapefiles
All data sets were converted into shapefiles and clipped to the DC Boundary for import into the geodatabase.

From the Arc: prompt window, the ARCSHAPE command was used to create the shapefiles. This required the user to specify which coverage to convert, the feature class, and the naming convention of the output shapefile.

in EarthData International, Inc

7320 Executive Way
Frederick MD, 21704, USA

phone: (301)948-8550
fax: (301)963-2064
hours: 8:30 AM - 5:00 PM Mon - Fri
Analytical Aerotriangulation:

Source photography - Wild RC-30 camera, natural color stable base.
Control - airborne GPS supplemented with photo identifiable field control.
Scanning - Z/I Imaging PhotoScan flatbed metric scanner.
Aerotriangulation - Photo-T.
Elevation Model - Lidar, autocorrelation and manual collection and update.
Radiometric Balancing - Proprietary and COTS Software (PhotoShop).
Orthorectification - Z/I Ortho Pro 4.0 software package.
Mosaic - Z/I Ortho Pro 4.0 software package.
Processed on Windows NT/2000 systems.

The ground control and airborne GPS data was integrated into a rigid network through the completion of a fully analytical bundle aerotriangulation adjustment.

1. The original aerial film was scanned at a resolution of 21 microns. The scans were produced using Z/I Imaging PhotoScan flatbed metric scanners.

2. The raster scans were given a preliminary visual check on the scanner workstation to ensure that the raster file size was correct and to verify that the tone and contrast were acceptable. A directory tree structure for the project was established on one of the workstations; this project was then accessed by other workstations through the network. The criteria used to establish the directory structure and file naming conventions avoid confusion or errors due to inconsistencies in the digital data. The project area was defined using the relevant camera information obtained from the USGS camera calibration report for the aerial camera and the date of photography. The raster files were rotated to the correct orientation for mensuration on the softcopy workstation; this rotation was necessary to accommodate different flight directions from one strip to the next. The technician verified that the datum and units of measurement for the supplied control were consistent with the project requirements.

3. The photogrammetric technician performed an automatic interior orientation for the frames in the project area. The softcopy systems that were used by the technicians have the ability to set up predefined fiducial templates for the aerial camera(s) used for the project. Using the template that was predefined in the interior orientation setup, the software identified and measured the eight fiducial positions for all the frames. Upon completion, the results were reviewed against the tolerance threshold. Any problems that occurred during the automatic interior orientation would cause the software to reject the frame and identify it as a potential problem. The operator then had the option to measure the fiducials manually.

4. The operator launched the point selection routine which automatically selected pass and tie points by an autocorrelation process. The correlation tool that is part of the routine identified the same point of contrast between multiple images in the Von Gruber locations. The interpolation tool can be adjusted by the operator depending on the type of land cover in the triangulation block. Factors that influence the settings include the amount of contrast and the sharpness of features present on the photography. A preliminary adjustment was run to identify pass points that had high residuals. This process was accomplished for each flight line or partial flight line to ensure that the network has sufficient levels of accuracy. The points were visited and the cause for any inaccuracy was identified and rectified. This process also identified any gaps where the point selection routine failed to establish a point. The operator interactively set any missing points.

5. The control and pass point measurement data was run through a final adjustment on the Z/I SSK PhotoT workstations. The PhotoT program created a results file with the RMSE results for all points within the block and their relation to one another. The photogrammetrist performing the adjustments used their experience to determine what course of action to take for any point falling outside specifications.

6. The bundle adjustment was run through the PhotoT software several times. The photogrammetrist increased the accuracy parameters for each subsequent iteration so that, when the final adjustment was run, the RMSE for the project met the accuracy of 1 part in 10,000 of the flying height for the horizontal position (X and Y) and 1 part in 9,000 or better of the flying height in elevation (Z). The errors were expressed as a natural ratio of the flying height utilizing a one-sigma (95%) confidence level.
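The tolerance arithmetic implied above, using the 7,200' flying height of the 1999 mission, works out as follows (a simple illustration, not project software):

```python
flying_height_ft = 7200.0  # 1999 mission altitude above mean terrain

# Tolerances quoted in the report: RMSE within 1 part in 10,000 of the
# flying height horizontally and 1 part in 9,000 vertically.
tol_xy = flying_height_ft / 10000.0   # 0.72 ft
tol_z = flying_height_ft / 9000.0     # 0.80 ft

def adjustment_passes(rmse_xy, rmse_z):
    """Does a final adjustment meet both quoted RMSE tolerances?"""
    return rmse_xy <= tol_xy and rmse_z <= tol_z

print(round(tol_xy, 2), round(tol_z, 2))  # → 0.72 0.8
```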

7. The accuracy of the final solution was verified by running the final adjustment, placing no constraints on any quality control points. The RMSE values for these points must fall within the tolerances above for the solution to be acceptable.

8. The final adjustment generates three files. The .txt file has all the results from the adjustment with the RMSE values for each point measured. The .XYZ file contains the adjusted X, Y, Z coordinates for all the measured points, and the .PHT file contains the exterior orientation parameters of each exposure station.

in EarthData International, Inc

7320 Executive Way
Frederick MD, 21704, USA

phone: (301)948-8550
fax: (301)963-2064
hours: 8:30 AM - 5:00 PM Mon - Fri
Digital Elevation Model (DEM):

Both Lidar and previously produced DEM data were available to support the production process. Following an analysis of the data, the previously produced DEM was selected for update and use.

The following provides a step-by-step outline of the production process.

1. The existing DEM, which was composed of both gridded mass points (at 10- to 20-meter spacing, with spot elevations) and the vertices of contour lines, was converted to dgn files for compilation.

2. The DEM was then merged together in MicroStation V8, and then split into 34 tiles, approximately 3077m X 3029m.

3. The compilation team updated the data with breaklines where needed, and collected 3D bridges. 3D bridges were collected to prevent smearing and warping, caused by the elevation difference between the bare earth and the elevated bridges. Proprietary MDLs for Microstation were run to create a 10 to 15 meter buffer around the bridges and to clip the surrounding ground data.

4. The dgn files were then merged into four large areas for QC purposes. The files were imported into TerraSolid/TerraModeler, and a TIN and a color relief were generated to search for any spikes or mismatches. This check is performed to fix any problems before going to the ortho stage. Large water areas were filled with elevation points.

5. Complex lines, shapes and arcs were dropped before delivering to the ortho department. A final level listing was run to ensure all the lines were dropped and the files were clean. This listing was provided to the ortho team.

in EarthData International, Inc

7320 Executive Way
Frederick MD, 21704, USA

phone: (301)948-8550
fax: (301)963-2064
hours: 8:30 AM - 5:00 PM Mon - Fri
Planimetric Data Capture:

The following planimetric layers were either updated from previous datasets or created during the production process -

- Building Polygons (BldgPly)
- Bridge and Tunnel Polygons (BrgTunPly)
- Horizontal and Vertical Control Points (GeoControlPt)
- Hydrography Center Lines (HydroCenterLineLn)
- Metro Entrance Points (MetroEntPt)
- Obscured Area Polygons (ObsAreaPly)
- Railroad Lines (RailRdLn)
- Road, Parking, and Driveway Polygons (RoadPly)
- Sidewalk Polygons (SidewalkPly)
- Under Construction Areas (UnderConstPly)
- Wooded Areas (WoodPly)

The following guidelines were used for the collection of hydrography centerlines - Hydrography lines were collected in the direction of flow through the center of all visible stream course features. Hydrography centerlines were coded as hidden where streams flowed underneath features that obstructed visibility such as bridges and overpasses. Areas between visible stream courses, where the actual course could not be confidently determined based on stereo-photography, were connected using a separate connector code.

The following guidelines were used for the collection of obscured areas - Obscured area polygons were delineated in areas where features could not be confidently determined based on stereo-photography. Such instances included areas of deep shade or heavy vegetation.

The following guidelines were used for data capture and change detection:

For layers tagged as update through change detection, features were removed if they no longer existed in the photography, added if new, or modified if the geometry changed (i.e. building additions). All layers carry a date of capture to delineate which features have been updated.

For change detection methodology, EarthData thoroughly reviewed the downtown area, mapping tiles that covered the downtown core, for change detection on all of the layers. Outside the downtown area, the contractor carefully reviewed the surrounding area for change detection on all of the layers only where there were building or road changes.
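The update-through-change-detection logic (remove, add, or modify) can be sketched as a set comparison keyed on a stable feature id; the ids and geometries below are hypothetical:

```python
# Hypothetical feature inventories keyed by a stable feature id; the
# value stands in for geometry (e.g. a bounding-box tuple).
existing = {"B001": (0, 0, 10, 10), "B002": (20, 0, 30, 10), "B003": (40, 0, 50, 10)}
current = {"B001": (0, 0, 10, 10), "B002": (20, 0, 35, 10), "B004": (60, 0, 70, 10)}

def detect_changes(old, new):
    """Classify features as added, removed, or modified, as in the
    update-through-change-detection workflow described above."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    modified = sorted(k for k in set(old) & set(new) if old[k] != new[k])
    return added, removed, modified

print(detect_changes(existing, current))
# → (['B004'], ['B003'], ['B002'])
```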

The following processes were involved in updating existing planimetric data levels -

1. Client supplied planimetric data from 1999 was received as merged ESRI shapefile layers.

2. Client supplied layers were converted into Microstation DGN format for updating in the stereo compilation environment. The conversion process involved clipping the merged layers into more manageable tile based files. Polygon feature codes were maintained and updated in the Microstation environment through the use of unique text annotation to define polygon centroid labels.

3. The Microstation linework was draped to existing DEM points to create 3D datasets that could be updated in a stereo compilation environment.

4. The Microstation tiles were updated using the 2005 stereo imagery. Any updated features were coded to reflect a change in status from existing to new.

5. The final updated tiles were checked for proper attribution and coding.

6. The tile datasets were converted into ESRI Geodatabase format for final topological and visual quality control.

7. The Geodatabase topology was checked against a rules file to detect any dangles in linework along with overlapping and intersecting features.

8. Polygon datasets were checked for adjacent areas containing the same code along with multiple code labels within the same polygon.

9. A visual check using the 2005 orthophotography was performed to look for and correct any improper attribution or missing features.

10. A final Geodatabase file was prepared based on the DC OCTO planimetric data structure.
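One of the polygon checks in step 8 — flagging adjacent polygons that carry the same code — can be sketched as follows (ids, codes, and adjacency are hypothetical; the real check ran as Geodatabase topology rules):

```python
# Hypothetical polygon records: feature code plus the set of neighbor ids.
polygons = {
    "P1": {"code": "ROAD", "neighbors": {"P2"}},
    "P2": {"code": "ROAD", "neighbors": {"P1", "P3"}},  # same code as P1
    "P3": {"code": "DRIVEWAY", "neighbors": {"P2"}},
}

def same_code_adjacent(polys):
    """Flag pairs of adjacent polygons carrying the same code, one of the
    polygon checks described in step 8 above."""
    flagged = set()
    for pid, rec in polys.items():
        for nid in rec["neighbors"]:
            if polys[nid]["code"] == rec["code"]:
                flagged.add(tuple(sorted((pid, nid))))
    return sorted(flagged)

print(same_code_adjacent(polygons))  # → [('P1', 'P2')]
```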


Most DC GIS datasets can be downloaded from "http//" on the data center page. Orders can
also be made by contacting OCTO GIS at "", especially for datasets not available for download.

Distribution liability
Acknowledgment of the DC Geographic Information Systems Program (DC GIS). The District of Columbia Government makes no claims as to the completeness, accuracy or content of any data contained hereon, and makes no representation of any kind, including, but not limited to, the warranty of the accuracy or fitness for a particular use, nor are any such warranties to be implied or inferred with respect to the information or data furnished herein.

Distributed by
GIS Data Coordinator
in D.C. Office of the Chief Technology Officer

441 4th Street NW, Suite 930 South
Washington D.C., 20001, USA

fax: (202)727-5660
hours: 8:30 am - 5:00 pm




  • GIS_ID
    OCTO GIS sequential identifier
  • FID / ESRI
    Internal feature number.
  • Shape / ESRI
    Feature geometry.
    Area of feature in internal units squared.


FGDC Content Standards for Digital Geospatial Metadata / FGDC-STD-001-1998 as of 06/22/2009

Provided by
GIS Data Coordinator
in D.C. Office of the Chief Technology Officer

441 4th Street NW, Suite 930
Washington D.C., 20001, USA

fax: (202)727-5660
hours: 8:30 am - 5:00 pm