
On-Going Projects
 

Federal Highway Administration

Abstract: Completed. Coming soon...

VOLPE Research Center

Abstract: Active. Coming soon...

National Aeronautics and Space Administration

Abstract: Completed. Coming soon...

Past Projects

Optical Character Recognition Algorithms for Pattern Recognition in Image Processing (2006)

Abstract: This study reviewed state-of-the-art Optical Character Recognition (OCR) technology for highway sign applications, described key problems with current technology, and proposed reliable new technology to meet future needs. OCR has become commonly available in recent years, allowing computer programs to automatically identify and interpret text in scanned images. As a result, many organizations have developed solutions to document image interpretation problems using OCR. However, only a few are investigating specialized applications, such as highway sign OCR for digital photo log data. Effective OCR technology for automated interpretation of highway signs remains elusive, and no definitive standard has been developed. The study summarizes the reasons OCR of highway signs is currently unreliable, proposes effective solutions to existing problems, and examines the benefits of OCR technology for automated highway sign interpretation.

Applications: Successful implementations of OCR technology in the field of document imaging allow problems like automated highway sign recognition to be simplified. Because OCR problems can be solved for most document images, highway sign recognition research focuses on preprocessing sign images to match scanned text images as closely as possible. Image contrast, geometry, and other parameters of highway sign images must be adjusted during preprocessing to compensate for variable lighting conditions, geometric distortions, and other effects. This study illustrates how state-of-the-art OCR for highway sign applications can succeed for selected images and also shows its limitations for other types of common photo log images. Finally, techniques for effective highway sign OCR are described for future development and implementation.

Return to top


 

Development of Ultra Light Inertial Profiler (2004-2009)

Abstract: A prototype Ultra Light Inertial Profiler (ULIP) was invented, designed, implemented, and tested. ULIP was proposed as a potential system to fulfill two primary needs of the pavement smoothness program of the Federal Highway Administration (FHWA): 1) development of an efficient method of measuring certification sites for reference, and 2) development of an accurate low-speed device. The prototype ULIP is a Segway Human Transporter (HT) equipped with triggers, a laser, and accelerometers. The first field tests were conducted at a medium-smooth research certification test site in northern Virginia (NOVA) on 6 January 2004. Because the data collected indicated a problem with profile precision, additional field tests were conducted during 2004. The effect of ULIP pitch changes on profile precision was investigated and verified. Development of the ULIP continued with the addition of macrotexture measurements and improved inertial profiling using a gyroscope.

Applications: The ULIP can measure roughness on newly poured concrete pavements to assess their quality and to provide an opportunity to correct problems before final curing. Because the ULIP can travel wherever wheelchairs go, the condition of these traveled surfaces, including sidewalk accessibility and roughness, can be assessed to determine needed improvements.

Return to top


 

Inductive Loop System Analysis (2005)

Abstract: A set of chain parameter matrices for 20 circuit elements and 10 transfer function elements was developed and implemented in SEQS in collaboration with Mr. Milton K. Mills. The effort resulted in about three times as many elements in SEQS as originally planned. The theory and a set of examples were tested and verified in MathCAD. The superequation for two-port applications has two broad groups of structures. The first handles the circuit elements shown in Figure 20, with four possible conditions of operation. There are two patterns of terminals, one with inputs and one without; condition zero is the only one that uses the pattern of terminals without inputs. The basic process of the two-port element updates the global and local two-dimensional matrices; the local matrix is passed through the upper terminals and the global matrix through the lower. A finite difference algorithm was provided by Milton K. Mills in the form of theory, equations, Fortran code, references, and example applications. The finite difference equations were used to solve Poisson's equation for magnetic field problems and Laplace's equation for electrical field problems. The problem of determining the capacitance between two round insulated wires used in inductive loop detector applications was solved using this technique. All theory and examples were tested, validated, and verified in two independent programming environments.
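
To illustrate the finite difference approach mentioned above, the sketch below relaxes Laplace's equation on a small rectangular grid with fixed boundary potentials using Jacobi iteration. The grid size, boundary values, and tolerance are illustrative assumptions, not parameters of the SEQS implementation.

    import numpy as np

    def solve_laplace(v, tol=1e-6, max_iter=20_000):
        """Jacobi relaxation of Laplace's equation on a 2-D grid.

        v holds the potential; its outer ring is treated as a fixed (Dirichlet)
        boundary condition and only interior nodes are updated.
        """
        v = v.astype(float).copy()
        for _ in range(max_iter):
            v_new = v.copy()
            # New interior value = average of the four neighbouring nodes.
            v_new[1:-1, 1:-1] = 0.25 * (v[:-2, 1:-1] + v[2:, 1:-1] +
                                        v[1:-1, :-2] + v[1:-1, 2:])
            if np.max(np.abs(v_new - v)) < tol:
                return v_new
            v = v_new
        return v

    # Illustrative boundary condition: one edge held at 1 V, the rest grounded.
    grid = np.zeros((40, 40))
    grid[0, :] = 1.0
    potential = solve_laplace(grid)
    print(potential[20, 20])   # potential near the centre of the region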

Return to top


 

Development of Stereoscopic Image Algorithms (2005)

Abstract: The process of stereoscopy is applied to images captured by two forward-facing digital cameras in a moving vehicle. The first step is calibration of the system. The geometry between the two views is described by the epipolar geometry, encapsulated in a 3x3 matrix known as the fundamental matrix F, which is a complete description of the intrinsic geometry (intrinsic parameters of the cameras) and their relative position. At least eight points must be correctly matched in each image to satisfy the requirements of the least-squares solution, the SVD decomposition, and the singularity constraint. The RANSAC algorithm is used to search for a satisfactory solution for F by removing outliers (bad correspondences); the best solution is the one that maximizes the number of points whose distance to the model is below a certain threshold. The second step is rectification of the images and calculation of depth, or range. Knowing the fundamental matrix, the images are rectified to the canonical stereo configuration, in which the epipoles move to infinity. An important advantage is that, in this case, there is a simple relationship between disparity and depth, and the stereo correspondence is reduced to a 1-D search problem during dense 3-D reconstruction.
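
For reference, the pipeline described above (eight-point estimation of F with RANSAC outlier removal, followed by rectification to the canonical configuration) can be sketched with OpenCV as shown below. This is a generic illustration, not the project code; the match files and image size are assumed placeholders.

    import numpy as np
    import cv2

    # pts_left / pts_right: N x 2 arrays of matched pixel coordinates from the
    # two forward-facing cameras (at least 8 correspondences are required).
    pts_left = np.load("matches_left.npy").astype(np.float32)    # hypothetical files
    pts_right = np.load("matches_right.npy").astype(np.float32)

    # Estimate the fundamental matrix F with RANSAC; `mask` flags the inliers
    # that survive removal of outliers (bad correspondences).
    F, mask = cv2.findFundamentalMat(pts_left, pts_right, cv2.FM_RANSAC, 1.0, 0.99)
    inliers_l = pts_left[mask.ravel() == 1]
    inliers_r = pts_right[mask.ravel() == 1]

    # Rectify to the canonical stereo configuration (epipoles at infinity):
    # H1 and H2 warp each image so that epipolar lines become horizontal,
    # reducing stereo correspondence to a 1-D search along each row.
    image_size = (1280, 960)                                     # assumed resolution
    ok, H1, H2 = cv2.stereoRectifyUncalibrated(inliers_l, inliers_r, F, image_size)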

Applications: Stereoscopy contributes to the fusion process of the DHM vehicle to define the three-dimensional geometry of the roadway.

Return to top


 

Numerical Optimization (2004)

Abstract: Pattern and gradient search optimization algorithms were applied to a six-dimensional two-port amplifier-transistor problem. Multiple methods were investigated: pattern search, gradient search, neural network, and genetic algorithm. The pattern search method was based on the Hooke and Jeeves approach; the Rhea method was also tested. The gradient search method selected was the conjugate gradient method using the Fletcher-Reeves approach. It requires the ability to compute the objective function's gradient, or first derivative, at any arbitrary point in a specified range. The Fletcher-Reeves method is identical to the Polak-Ribiere method except for the way the derivative term is incremented. Another method of interest for the future is the Broyden-Fletcher-Goldfarb-Shanno method, which is categorized as a variable metric search method. It differs from the Fletcher-Reeves method only in how the information used to assess convergence is retained and accumulated: instead of a vector of dimension N, it requires a matrix of dimension NxN, where N is the number of dimensions in the objective function. Focusing on the Fletcher-Reeves method, the next point in the search is estimated as the sum of the previous point and a scalar times the estimate of the gradient at the previous point. The scalar is set using a bracketing technique and Brent's method, which minimizes round-off errors. Given the previous point, Pi, and the gradient at Pi, dPi, the search for the optimum scaling factor, ai, is performed in two steps that do not affect Pi and dPi. The next value, Pi+1, is equal to Pi + ai x dPi. The Fletcher-Reeves method was implemented in MathCAD.
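
A minimal sketch of the Fletcher-Reeves update described above is given below, with a simple backtracking line search standing in for the bracketing/Brent step used in the MathCAD implementation. The test function and step parameters are illustrative assumptions.

    import numpy as np

    def fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=200):
        """Minimal Fletcher-Reeves conjugate-gradient sketch."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                    # initial direction: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Backtracking (Armijo) line search for the step length a_i.
            a, fx = 1.0, f(x)
            while f(x + a * d) > fx + 1e-4 * a * g.dot(d) and a > 1e-12:
                a *= 0.5
            x_new = x + a * d                     # P_{i+1} = P_i + a_i * d_i
            g_new = grad(x_new)
            beta = g_new.dot(g_new) / g.dot(g)    # Fletcher-Reeves update of the direction
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Usage on a simple quadratic bowl (stand-in for the amplifier objective).
    f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
    grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
    print(fletcher_reeves(f, grad, [5.0, 5.0]))   # approaches [1, -2]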

Applications: The collection of optimization methods was applied to the two-port solution of an amplifier-transistor.

Return to top


 

Data Mining Highway Engineering Databases

Abstract: There are many data mining tools available on the market, many of them intended for experienced statistical specialists. In this project, Scenario from Cognos Software was selected because it provides a good balance between accuracy and ease of use; only a moderate level of experience is required to get immediate data analysis results. This visual tool permits a first investigation to identify structures and relationships in the data. Scenario uses the concepts of targets and factors: the target is the field to be examined, and factors are the fields that could influence the target. Scenario helps identify the "overall fit," an estimate of how well the analysis explains the target's variation. For that purpose, Scenario explores the data and groups them into segments; a segment contains records that are similar to each other in terms of their impact on the target. The model used in Scenario is based upon decision trees. Any discovery should be cross-checked with other approaches, such as logistic, sigmoidal, or hinging hyperplanes.
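
Scenario itself is a commercial tool, but its target/factor idea maps onto an ordinary decision-tree fit. The sketch below uses scikit-learn with hypothetical NBI-style field names, and the R^2 score stands in loosely for Scenario's "overall fit."

    import pandas as pd
    from sklearn.tree import DecisionTreeRegressor

    # Hypothetical NBI-style extract: the condition rating is the target,
    # the remaining fields are candidate factors.
    records = pd.read_csv("nbi_extract.csv")           # assumed input file
    target = records["deck_condition_rating"]          # hypothetical field names
    factors = records[["year_built", "adt", "structure_length", "deck_area"]]

    # A shallow tree keeps the segments interpretable, mimicking the way
    # Scenario groups records that influence the target similarly.
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(factors, target)

    # R^2 on the training data acts as a rough stand-in for the "overall fit."
    print("overall fit (R^2):", tree.score(factors, target))
    print("factor importances:", dict(zip(factors.columns, tree.feature_importances_)))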

Applications: Using the National Bridge Inventory System (NBI) data, a few analyses are presented. The approach of data mining the NBI can be extended to explore data for other projects with minimal effort.

Return to top


 

 

Development of Digital Highway Measurement System (2003-2006)

Abstract: The Digital Highway Measurement (DHM) vehicle uses multiple sensors to measure the horizontal and vertical alignments of roads and highways. Profiles of the roadside and characteristics of the pavement surface are also created while traveling at normal traffic speeds, with accuracy not available commercially. The areas of improvement to the state of the practice provided by the DHM vehicle are: a) improvement of the accuracy and precision of the vehicle position by at least an order of magnitude, b) elimination of vehicle lane wander to produce centerline traces, and c) insensitivity to stop-and-go operations in urban and rural environments where continuous GPS coverage is not available. A complete functional specification for the DHM hardware and software is included in the full task report. In addition, other analyses such as cross sections, roadside profiles, and image processing, including stereoscopy, pattern recognition, and ground-penetrating radar (GPR), are discussed. In the full task report, graphs present the details of the comparisons within each type of data to support the analysis of the accuracy of the DHM, and a summary of the results is provided in an executive report. Specifically, the accuracy and precision in establishing a centerline are discussed. The large data sets from each of the DHM sensors are fused to define an accurate geometry of the road and the roadside; the vehicle wander in the lane is then removed from the data to produce an accurate centerline trace. From this centerline an accurate project coordinate system is defined. The post-processing software extracts key points of the horizontal alignment from the measured centerline. When possible, a similar post-process is performed on the vertical profile to identify points of vertical curvature and tangency. For each site, six runs were analyzed, and the results were contrasted with those obtained with a state-of-the-practice vehicle. The following sources of potential "ground truth" were considered: site plans, a static laser scan using CYRAX technology, and digital orthophotography from the Virginia Geographic Information Network.

Applications: The DHM determines the vehicle position on the road as a function of time, so that its trajectory can be accurately determined for the purposes of: a) analyzing the trajectory to determine if the driver is falling asleep, b) analyzing the trajectory to determine if the driver is driving erratically, c) advising the driver of approaching road geometry variations (filtered to a necessary level in order to prevent excessive false alarms), d) including the vehicle dynamics to determine potential loss of control due to excessive speed, e) including measured road surface friction condition for dry and wet conditions, f) including warning of hazards under reduced visibility conditions, g) including highway construction locations provided by the states, h) providing localized profiles of the grades at the approach and exit of railroad crossings, i) providing overhead bridge clearances (measured simultaneously with other geometry data), j) providing project-level condition assessment of existing conditions for planning and design, and inspection of final rehabilitation or new construction projects for pay factors, and k) providing existing-conditions information to highway-driver simulators and to interactive highway-safety design software.

Return to top


 

Application of HHT to Highway Engineering (2003-2006)

Dr. Gagarin, under a task order agreement with FHWA, collaborated with Dr. N. E. Huang of NASA, who worked under an inter-agency agreement with FHWA, to study and apply the HHT to highway engineering problems.

Abstract: There are many non-stationary and nonlinear processes present in the highway environment. In the past, the methodologies for extracting engineering information were based primarily on Fourier analysis and similar linear and stationary methods. The development of the Hilbert-Huang Transform (HHT) has provided a tool to analyze these same processes without the limiting assumptions of those methods. The objective of the study is to identify problems that will benefit from use of the HHT. The paper describes the problems and data initially explored, along with the description and status of the following applications: 1) decomposition of sound pressure levels for the analysis of tire-pavement interaction noise, 2) an impact-echo experiment for the computation of a quality factor for freeze-thaw damage in concrete, 3) error analysis, synchronization, and comparison of road profiles, 4) pavement thickness estimation using ground penetrating radar, 5) a review of falling weight deflectometer time-series data, and 6) filtering of strain measurements for bridge weigh-in-motion and condition assessment. Preliminary results are also discussed. Other areas under consideration are: 1) wind-structure interaction, 2) earthquake response, 3) scour in hydraulics, and 4) disturbances in traffic flow.

Applications: Inertial road profile measurements are widely used to assess the condition of existing pavements and to monitor quality control for smoothness of newly constructed pavements. A metric known as the International Roughness Index (IRI), computed from the inertial road profile, quantifies the ride quality of the pavement. Within the inertial road profile, the variation of frequency-amplitude content versus distance is nonlinear and non-stationary, and the Hilbert-Huang Transform (HHT) and its Empirical Mode Decomposition (EMD) are well suited for such data. In the application of the HHT to inertial profile analysis, the intrinsic mode functions and their Hilbert transforms can be used to: 1) represent the frequency/wavelength content of a profile, 2) filter the data in the distance domain in preparation for secondary analyses, and 3) compare two profiles and assess their similarities by performing simultaneous distance-frequency synchronization.
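
The decomposition step requires an EMD implementation (for example, an external EMD library), but once an intrinsic mode function is available its instantaneous amplitude and frequency follow from the analytic signal, as in the sketch below. The sample spacing and test signal are illustrative, and this is not the analysis code used in the study.

    import numpy as np
    from scipy.signal import hilbert

    def instantaneous_attributes(imf, sample_spacing):
        """Instantaneous amplitude and frequency of one intrinsic mode function.

        imf: evenly sampled IMF values (e.g., one mode of a road profile)
        sample_spacing: distance between samples (m), so frequency is in cycles/m
        """
        analytic = hilbert(imf)                   # analytic signal via the Hilbert transform
        amplitude = np.abs(analytic)              # instantaneous amplitude envelope
        phase = np.unwrap(np.angle(analytic))     # continuous instantaneous phase
        frequency = np.gradient(phase, sample_spacing) / (2.0 * np.pi)
        return amplitude, frequency

    # Illustrative IMF: a chirp whose wavelength shortens with distance.
    x = np.arange(0.0, 100.0, 0.05)               # 100 m of profile at 50 mm spacing
    imf = np.sin(2 * np.pi * (0.05 + 0.001 * x) * x)
    amp, freq = instantaneous_attributes(imf, 0.05)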

Return to top


HA-DGPS Diplexer Design and Optimization (2003)

Abstract: The findings and tools from the two-port circuit design utilities described above were applied to the analysis of the High Accuracy DGPS diplexer to determine the electrical performance of the current diplexer design. The solution is designed so that circuit component values can be changed either automatically, using an optimizer, or manually, to see whether improved performance is possible. The potential use of neural-network and genetic-algorithm optimizers for this problem was investigated. The electrical performance of the diplexer is displayed in both a phase/amplitude graph and a Smith chart.

Return to top


Relationship between Macro and Micro-texture (2002)

Abstract: In cooperation with the Virginia Department of Transportation, pavement surface texture and Skid Number were collected simultaneously and continuously on a wide variety of pavement surfaces at the NASA Wallops Island runway test facility. The data allowed a correlated analysis of Skid Number as it changes dynamically with surface texture and surface features such as joints, patches, faults, and profile, giving a clearer picture of the friction/macrotexture interaction.

Applications: A laser-based system for the measurement of texture on a continuous basis at the network level can provide a quick, relatively inexpensive, and comprehensive method of pavement friction condition assessment. This information can then be used as part of a pavement management system for allocating resources for pavement management and rehabilitation. The dynamic relationship of surface features and texture with Skid Number was clearly illustrated. Current single-point Skid Numbers for a test section and manual methods of surface feature and texture measurement are not truly representative of the dynamic nature of pavement surfaces with respect to friction. The work performed showed the need for, and benefit of, simultaneous data collection using multiple sensors.

Return to top


Creation and Support of CRL (2002 – 2005)

Abstract: The Advanced Research Team (ART) worked on establishing a Computational Resource Laboratory (CRL) at the TFHRC; the CRL effort is intended to run over many years. The need for such a laboratory is a result of the maturation of the SEQS software. As SEQS has developed, the number of powerful analysis tools within it has steadily increased, and ART is now turning its attention to streamlining access to these tools by other researchers at the TFHRC. The ART staff needed to provide seamless access for all researchers at the TFHRC to the SEQS documentation and software, with a few minor conditional constraints. The first step of this project conceptualized a total system that accomplishes the long-range objectives for the CRL. The "system" includes every type of item, from hardware to software, whether presently within SEQS or external support software or hardware that would maximize the operation of the CRL, including analytical training videos and self-paced training materials. The conceptualization process considered direct connection to a CRL server from any computer at the TFHRC.

Return to top


 

Multi-Dimensional Visualization (2002-2004)

Abstract: The long-range purpose of the project is to increase the level of insight a researcher acquires from merging powerful visualization tools, large data sets, and the corresponding computational hardware. Custom applications were also developed to explore tire-pavement interaction noise and warp and curl databases. One example using OpenDX focuses on determining the global pollution dispersion over a region based on the information available at the micro level and the wind conditions. The detailed dispersion pattern versus time is compared to the instantaneous average for the region as computed from the micro-level data. The other example depicts the evolution of many origin-destination patterns in the micro-level data. The next objective is to compare the resulting patterns with other existing O-D data formats in use by the transportation community and to evaluate the strengths and weaknesses of each. The focus of this work is to explore the power of merging visualization tools, computational tools, and large data sets using a personal computer, with specific emphasis on increasing problem insight for a researcher.

Applications: The consistent images of the total investigation process can lead to potential improvements and enhancements of the analysis process. Visualization is not only a way to produce images for presentations at the end of a project; it can be integrated within the investigation process to test new hypotheses and to spot data details and patterns not yet predicted. OpenDX was the primary software used to set up a visualization tool for vehicle emissions produced at the micro level over a geographic area. Different traffic simulation scenarios were used to depict and evaluate the impact of micro conditions on global pollution dispersion.

Return to top


 

Accelerometer Study and Light-Weight Profilers (2001)

Abstract: A parametric study of inertial profiling technology using mathematical simulation concluded that the sensitivity level of the accelerometers was a source of error; roll and pitch were also hypothesized as additional sources of error. The objective of this study was to determine the accelerometer requirements for high-speed inertial profilers to collect accurate profiles at speeds from 15 to 70 miles per hour using both 200- and 300-foot long-wave filters. The research consisted of 10 runs at each speed over the 15 to 70 miles per hour range, using multiple accelerometers of various manufacturers and models. Data acquisition (DAQ) software, data mining tools, and analysis applications were developed to study the effect of accelerometer sensors on the accuracy and precision of road profiles. An electronics and hardware package was constructed using sensors available within the FHWA Turner-Fairbank Highway Research Center (TFHRC) Pavement Surface Analysis (PSA) Lab and from FLHD. A follow-up experiment using two commercial light-weight profilers confirmed the effect of pitch and roll accelerations on the accuracy of inertial profiles.
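
For context, the core of one-dimensional inertial profiling can be sketched as a double integration of body acceleration combined with the laser height measurement and a long-wave high-pass filter, as below. The sample rate, speed, and cutoff wavelength are illustrative assumptions; production profilers apply considerably more signal conditioning.

    import numpy as np
    from scipy.integrate import cumulative_trapezoid
    from scipy.signal import butter, filtfilt

    def inertial_profile(accel, laser_height, speed, sample_rate, cutoff_wavelength=91.4):
        """Minimal inertial-profile sketch.

        accel:             vertical body acceleration (m/s^2)
        laser_height:      laser distance from body to pavement (m)
        speed:             constant travel speed (m/s)
        cutoff_wavelength: long-wave filter cutoff in metres (about 300 ft)
        """
        dt = 1.0 / sample_rate
        t = np.arange(len(accel)) * dt
        # Double-integrate acceleration to get the vertical motion of the vehicle body.
        velocity = cumulative_trapezoid(accel, t, initial=0.0)
        body_height = cumulative_trapezoid(velocity, t, initial=0.0)
        # Raw profile: body motion minus the body-to-road distance.
        profile = body_height - laser_height
        # High-pass filter to remove integration drift and wavelengths longer than
        # the cutoff (converted from a wavelength to a temporal frequency).
        cutoff_hz = speed / cutoff_wavelength
        b, a = butter(2, cutoff_hz / (sample_rate / 2.0), btype="highpass")
        return filtfilt(b, a, profile)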

Applications: The requirements produced by this research were adopted and implemented in the portable systems used by FLHD. These updated systems passed the Texas Department of Transportation and Texas Transportation Institute's certification procedure, which is similar to the proposed AASHTO guide specification for the certification/validation of inertial profilers. Results were used by an AASHTO Expert Task Group in the development of inertial profiler guidelines. This project also identified potential sources of measurement error in both high-speed and light-weight profilers that were subsequently confirmed in the later DHM and ULIP projects. The design of the experiment included real-world conditions, and the simulation methods used exposed limitations of one-dimensional inertial profiling technology.

Return to top


Warp and Curl (2001)

Abstract: The feasibility of measuring slab curvature using a non-contact, high-speed profiler system was assessed by the Pavement Surface Analysis Laboratory (PSAL) of the Turner-Fairbank Highway Research Center of the Federal Highway Administration (FHWA). The primary objectives of the pilot study were to (1) measure the curvature of concrete slabs using existing profiling technology, (2) establish a standard procedure to identify warp and curl from profiler data, (3) quantify the change in curvature due to temperature changes, (4) determine the sampling interval required to measure curvature, (5) determine the relationship between curvature and roughness measurements, and (6) relate curvature data to premature transverse pavement cracking and construction conditions. The PSAL research profiler was used to survey 64 lane-miles of interstate highway in a period of 24 hours; a single pass of 16 miles in each lane was collected during each of four two-hour windows: 4-6 am, 6-8 am, 11 am-1 pm, and 6-8 pm. A procedure to measure joint rotation and faulting was established. A report focused on the analysis, including faulting and joint rotation, with the effect of temperature. The parallel coordinates approach to visualization was applied to concrete slab condition assessment. A visualization program named "WarpExplorer" was developed to browse the verification and validation database of inertial profiles, and a visualization program named "CurlExplorer" was developed to browse the database of 16,000 concrete slabs, which includes joint condition, slab profiles, roughness, cracking information, deformation parameters, and temperature.

Applications: Measuring the change in slab surface profiles and joint condition at different temperature loads using a high-speed inertial profiler provides a method of monitoring the condition of concrete slabs. Using the parallel coordinates approach on more than twenty parameters related to the profile measurements, the slabs most likely to fail can be isolated from the entire set of slabs measured.
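
A parallel-coordinates view of per-slab parameters is easy to reproduce with pandas and matplotlib; the column names below are hypothetical stand-ins for the curvature, joint, and roughness parameters stored in the CurlExplorer database, and the risk grouping is only an illustration.

    import pandas as pd
    import matplotlib.pyplot as plt
    from pandas.plotting import parallel_coordinates

    # Hypothetical per-slab table: one row per slab, one column per parameter.
    slabs = pd.read_csv("slab_parameters.csv")     # assumed export, not the real database
    columns = ["curvature", "joint_rotation", "faulting", "iri", "surface_temp"]

    # Label each slab by a simple curvature-based risk group so that slabs most
    # likely to fail stand out as a colour class in the plot.
    slabs["risk"] = pd.cut(slabs["curvature"].abs(), bins=3,
                           labels=["low", "medium", "high"])

    # Normalize each parameter to 0-1 so all axes share a comparable scale.
    slabs[columns] = (slabs[columns] - slabs[columns].min()) / \
                     (slabs[columns].max() - slabs[columns].min())

    parallel_coordinates(slabs[columns + ["risk"]], class_column="risk", alpha=0.4)
    plt.ylabel("normalized parameter value")
    plt.show()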

Return to top


 

Preparation of Software Specifications, Verifications, Validation, Evaluation, and Application of SEQS (Phase II) (2001)

Abstract: This project is the continuation of the previous work under a task order agreement between Starodub and FHWA. Limitations on the level of interactive help, number of equations, graphics, and functionality were addressed. Software for the following tasks was specified, designed, and implemented in the SEQS environment: a) Data Fusion and Mining of Tire-Pavement Interaction Noise, Pavement and Tire-Print Textures, b) Tire-Pavement Interaction Noise, c) Data Fusion and Mining of Skid Numbers and ROSANv Macro-texture, d) Data Mining of 1-D and 2-D ROSANv Macro-texture Samples, e) Development of Warp and Curl Analysis Algorithms, f) Data Reduction of Validation Sites in Delaware and Pennsylvania for the Warp and Curl Project, g) Automatic Sign Recognition System, h) Development of an Ultra Light Inertial Profiler, and i) Two-Port Electrical Circuit Elements for Electromagnetic Modeling.

Applications: Starodub proposed to use SEQS as a means to disseminate the library of algorithms and references in the CRL and to monitor the use of CRL software in the future. In addition, SEQS will be used in training and technology transfer to the highway community. The ULIP product is an example of the use of SEQS to fuse the data acquisition process, under the National Instruments NIDAQ environment, with post-processing under seven different libraries of analysis functions. Similar approaches can be applied to more complex systems. SEQS is intended to be used by the FHWA.

Return to top


Development of Eigen Sign Recognition Algorithm (2000)

Abstract: The Automatic Sign Recognition System uses a local eigen feature method to define the optimum orientation of the dimensions in a digital image. The new method is also based on an eigen solution; however, the features of interest in a training set are processed globally to produce a series of statistically significant features to be searched for in a digital image. This method can be applied to various features of interest.
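
The global eigen solution described above is in the spirit of eigen-feature (eigenfaces-style) matching: training windows are stacked, a principal-component basis is extracted, and candidate windows are scored by how well that basis reconstructs them. The sketch below is a generic illustration with assumed patch sizes, not the copyrighted ASRS algorithm.

    import numpy as np

    def train_eigenfeatures(patches, n_components=10):
        """patches: (N, h, w) array of training windows containing the feature."""
        flat = patches.reshape(len(patches), -1).astype(float)
        mean = flat.mean(axis=0)
        # Principal components of the training set via SVD of the centred data.
        _, _, vt = np.linalg.svd(flat - mean, full_matrices=False)
        return mean, vt[:n_components]

    def reconstruction_error(window, mean, basis):
        """Low error means the window is well explained by the trained feature space."""
        x = window.ravel().astype(float) - mean
        projection = basis.T @ (basis @ x)
        return np.linalg.norm(x - projection)

    # Usage idea: slide a 32x32 window over a photolog frame and keep low-error locations.
    # patches = np.stack([...training sign windows...])   # assumed training set
    # mean, basis = train_eigenfeatures(patches)
    # score = reconstruction_error(frame[y:y+32, x:x+32], mean, basis)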

Return to top


Preparation of Software Specifications, Verification, Validation, Evaluation and Application of Equation Shell Software (Phase I) (2000)

Abstract: The Equation Shell Software (EQS) is a computer tool developed to facilitate data analysis. It incorporates research findings in the development of new highway products, helping a small research staff with limited resources develop sophisticated solutions. EQS is an object-oriented programming system with a graphical user interface that allows the graphical definition and construction of algorithms. It can be used as a tool for rapid prototyping, verification, validation, visualization, production/diagnosis, and training/teaching. It is people-oriented and user-friendly, and its components are reusable. To date, six major phases of development have been completed: 1. information gathering, 2. synthesis and verification of the knowledge base, 3. identification of highway applications, 4. system design and development of EQS functionality (three cycles), 5. preparation of a set of equations in EQS objects, with testing and checking, and 6. initial deployment.

Return to top


Development of Quality Factor Analysis Method of Freeze-Thaw Experiment Software (1999)

Abstract: The freeze-thaw experiment on small-size concrete beams requires two main components. The first is a data acquisition setup with two sensors, an accelerometer and an impulse hammer; the second is analysis software to reduce the recorded data. Both components of the freeze-thaw experiment have been integrated into the EQS software for demonstration purposes. A basic data acquisition equation to collect simultaneous samples from two sensors was developed and tested with a signal generator as well as with the two sensors. The analysis component was reviewed and can be integrated. The focus of the project was the development of a production version of the quality factor application shell.
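
One common way to obtain a quality factor from impact-echo data is the half-power bandwidth of the resonance peak in the response spectrum, Q ≈ f_peak / Δf. The sketch below applies that textbook definition to a recorded trace; it is an assumption about the method, not the EQS implementation.

    import numpy as np

    def quality_factor(response, sample_rate):
        """Estimate Q = f_peak / bandwidth from the half-power (-3 dB) points."""
        spectrum = np.abs(np.fft.rfft(response * np.hanning(len(response))))
        freqs = np.fft.rfftfreq(len(response), d=1.0 / sample_rate)
        peak = np.argmax(spectrum)
        half_power = spectrum[peak] / np.sqrt(2.0)
        # Walk outward from the peak to the half-power crossing on each side.
        lo = peak
        while lo > 0 and spectrum[lo] > half_power:
            lo -= 1
        hi = peak
        while hi < len(spectrum) - 1 and spectrum[hi] > half_power:
            hi += 1
        bandwidth = freqs[hi] - freqs[lo]
        return freqs[peak] / bandwidth if bandwidth > 0 else np.inf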

Applications: The project was the first data acquisition problem addressed within SEQS; the DHM and ULIP projects followed. Data mining was performed using state-of-the-art signal processing methods.

Return to top


Tire-Pavement Noise Interaction (1999)

Abstract: The Pavement Surface Analysis (PSA) team conducted a series of experiments at the National Aeronautics and Space Administration (NASA) base at Wallops Island, collecting data for a tire-pavement noise interaction study. The primary objective of the study was to define the relationship between noise levels, tire texture, and pavement texture. The Wallops Island NASA facility provided an excellent test site with over a dozen concrete or asphalt pavement surface conditions, with and without grooves, and a variety of textures. Three types of vehicles (car, sport utility vehicle, and small truck) were used on eight of the pavement surfaces for two experimental setups: 1) outside noise/single vehicle/controlled conditions, and 2) inside noise/single vehicle/controlled conditions. This project led to two additional task orders: to study the relationship between macro-texture and sound pressure levels and to develop a data visualization tool. In addition, Dr. Gagarin collaborated with Dr. James Mekemson in the production of a ROSANv unit for CALTRANS to assist them in their follow-up on this research. A visualization software package named "WaveExplorer" was designed and developed to browse a limited database of tire-pavement interaction noise developed by the Pavement Surface Analysis Laboratory. The data for the database are from two sets of experiments: a) out-of-vehicle noise with a microphone placed near the travel lane, and b) in-vehicle noise with a microphone installed in the vehicle. Tire texture was estimated by obtaining a print of the tread pattern at the time of the tests and, using image processing, identifying the tire pattern line. Pavement texture was measured using ROSANv. A graphical user interface was developed to enable a user to select and change the values of the parameters controlling the types of tests to be heard and seen on the screen simultaneously.

Applications: After a published progress report on this research effort, six state DOTs inquired directly about the status and availability of the final product. A tire-pavement noise database was developed which contains not only the characteristics of the pavement surface, but also that of the tire texture.

Return to top


Automatic Sign Recognition and Inventory System (ASRS) using Photologs (1999-2001)

Abstract: A sequence of image processing functions is applied to identify the presence of features in the photologs as part of a prototype automatic sign inventory system. This effort will produce a system capable of recognizing a finite set of individual traffic signs from photolog data in the Highway Safety Information System (HSIS) database from eight states. To date, a computer program was developed to interface with the database and define training sets for pattern recognition. After the training sets were defined, a process of verifying and validating the sets was created using a data browser. The data are subjected to a quality check procedure which determines the limits of the domain covered. The design and implementation of the pattern recognition model was completed. This work led to a task order to review the findings of the Connecticut DOT; the algorithm within the ASRS system was retrained, and the outliers identified by the Connecticut DOT were in fact properly detected with the updated system.

Applications: The state of Connecticut is collaborating closely with the HSIS laboratory, and eventually the other states among the eight will become users of the system. A software application driven by a proprietary algorithm was developed and copyrighted. The first version of the Automatic Sign Recognition System was deployed in Connecticut in January 2000.

Return to top


Development of a Design Equation for Development Length of Prestressing Strands (1998)

Abstract: The FHWA has produced an extensive database on the development length of prestressing strands in full-size AASHTO Type VI girders with and without decks, small-size beams, and deck panels. Three strand diameters were used, and both uncoated and epoxy-coated surface conditions were included in the experimental phase. In order to respond to the immediate request of AASHTO, the analysis focused on uncoated results for full-size girders and small-size beams, including available information from all sources. An extensive structural analysis of the data was conducted to estimate effective and ultimate prestress strengths for each test. Combined with transfer and flexural-bond strength information, and with statistically based criteria for inclusion of data in the final database, a design equation to estimate the development length of prestressing strands was produced and submitted to the AASHTO T-10 committee for review.

Applications: The equation has been approved by the AASHTO T-10 committee after a six-month review period by the states. It will be included in the AASHTO design specifications for the design of prestressed concrete highway bridges.

Return to top


Preparation of Software Specifications, Verification, Validation, Evaluation and Application of Equation Shell Software (1998-2006)

Abstract: The Equation Shell Software (EQS) is a computer tool developed to facilitate data analysis. It incorporates research findings in the development of new highway products, helping a small research staff with limited resources develop sophisticated solutions. EQS is an object-oriented programming system with a graphical user interface that allows the graphical definition and construction of algorithms. It can be used as a tool for rapid prototyping, verification, validation, visualization, production/diagnosis, and training/teaching. It is people-oriented and user-friendly, and its components are reusable. To date, six major phases of software development were completed: 1. information gathering, 2. synthesis and verification of the knowledge base, 3. identification of highway applications, 4. system design and development of EQS functionality (three cycles), 5. preparation of a set of equations in EQS objects, with testing and checking, and 6. initial deployment. Limitations on the level of interactive help, number of equations, graphics, and functionality were corrected. Software for the following tasks was specified, designed, and implemented in the SEQS environment: a) Data Fusion and Mining of Tire-Pavement Interaction Noise, Pavement and Tire-Print Textures, b) Tire-Pavement Interaction Noise, c) Data Fusion and Mining of Skid Numbers and ROSANv Macro-texture, d) Data Mining of 1-D and 2-D ROSANv Macro-texture Samples, e) Development of Warp and Curl Analysis Algorithms, f) Data Reduction of Validation Sites in Delaware and Pennsylvania for the Warp and Curl Project, g) Automatic Sign Recognition System, h) Development of an Ultra Light Inertial Profiler (ULIP), and i) Two-Port Electrical Circuit Elements for Electromagnetic Modeling.

Applications: Starodub proposes to use SEQS as a means to disseminate the library of algorithms and references in the Computational Resource Laboratory (CRL) and to monitor the use of CRL software in the future. In addition, SEQS will be used in training and technology transfer to the highway community. The ULIP product is an example of the use of SEQS to fuse the data acquisition process, under the National Instruments NIDAQ environment, with post-processing under seven different libraries of analysis functions. Similar approaches can be applied to more complex systems such as the Digital Highway Measurement (DHM) system.

Return to top


Development of ROSAN algorithms (1997-2000)

Abstract: In October 1997, FHWA and Surfan Engineering Software, Inc. (SES, Inc.) signed a Cooperative Research and Development Agreement (CRADA) which supported the conversion of the research prototype of the ROad Surface ANalyzer (ROSAN) into a product available to the highway industry, including state agencies, contractors, and researchers. After a year of effort, the CRADA team (FHWA and SES, Inc.) completed the first phase of product development.

Applications: Truly an intermodal venture, ROSAN has applications in all branches of the Department of Transportation (DOT). ROSAN has fulfilled the need to monitor the surface condition of runways, answering the FAA regulatory requirements imposed on primary airports. AMTRAK also expressed interest, during an exploratory meeting with SES, Inc. in Philadelphia, in developing a version of ROSAN to survey the track-pantograph-catenary system along the Northeast Corridor. A year after the initiation of the CRADA, SES, Inc. measured runway texture and grooves at Kennedy Airport for the Port Authority of New York and New Jersey and supported aggregate segregation research at Auburn University, and before the end of the year an updated version of the ROSAN product with profiling capabilities became available.

Return to top


Visualization of Expert Systems (1997)

Abstract: In preparation for a handbook of expert systems, the following were included: 1) drawings illustrating a six-rule expert system, showing Hoffman regions (binary example), 2) drawings illustrating a rule-based expert system with numerical tests, showing nontrivial Hoffman regions, 3) drawings incorporating decision tables, 4) drawings showing flowcharts instead of trees, 5) drawings illustrating atomic formulas, 6) a study of proofs of theorems, and 7) a review of the text, indicating possible needs for details, clarification, and examples.

Return to top


Estimating Pier Scour with Artificial Neural Networks (1996)

Abstract: An artificial neural network (ANN) is used to estimate scour at bridge piers. The ANN is based on 515 sets of field data assembled in a report by Gao in 1992. Its performance is compared to the Chinese equations for live-bed and clear-water pier scour presented by Nordin. It was found that ANNs may be better suited to understanding the governing processes, where many uncertain scour conditions exist, than to predicting actual depth quantities.

Applications: Other researchers are emulating these findings and continuing this research. A software application driven by a proprietary algorithm was developed and copyrighted.

Return to top


Analysis of Sediment Transport Data (1996)

Abstract: An artificial neural network is used as a computing device for determining sediment transport in open channels. The application was developed from 1,455 data sets extracted from laboratory and field records assembled and used for developing formulas and regression equations used in BRI-STARS for sediment accounting. The network was trained to estimate sediment transport for all channel and bed material types. The advantage of this ANN application is that it can be nested in BRI-STARS or other sediment transport models without the need to select the most suitable transport equation.

Applications: Other researchers are emulating these findings and continuing this research. A software application driven by a proprietary algorithm was developed and copyrighted.

Return to top


Data Mining Highway Engineering Databases using Artificial Neural Networks (1994-1998)

Abstract: The applicability of multidimensional modeling of engineering and natural systems with emerging empirical numerical techniques known as artificial neural networks (ANNs) was investigated. ANNs are built from large sets of simple computing units operating in parallel. The function embedded in each unit, or activation function, is defined by a few parameters whose values are determined during a training process based on a few simple rules.
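
In current terms, the units, activation functions, and training rules described above correspond to a small feed-forward network trained by gradient descent. The single-hidden-layer NumPy sketch below is illustrative only; the layer size and learning rate are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def train_ann(X, y, hidden=8, lr=0.1, epochs=5000):
        """One hidden layer of sigmoid units, linear output, trained by backpropagation.

        X: (N, d) matrix of input factors; y: (N, 1) column of target values.
        """
        W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
        b1 = np.zeros(hidden)
        W2 = rng.normal(scale=0.5, size=(hidden, 1))
        b2 = np.zeros(1)
        for _ in range(epochs):
            # Forward pass: each unit applies a simple parameterized activation.
            h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # sigmoid hidden layer
            pred = h @ W2 + b2                         # linear output unit
            err = pred - y                             # prediction error
            # Backward pass: gradient-descent update rules for the parameters.
            W2 -= lr * h.T @ err / len(X)
            b2 -= lr * err.mean(axis=0)
            dh = (err @ W2.T) * h * (1.0 - h)
            W1 -= lr * X.T @ dh / len(X)
            b1 -= lr * dh.mean(axis=0)
        return W1, b1, W2, b2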

Applications: A series of proprietary software applications were developed, of which the next two applications below are examples.

Return to top


Application of Signal Processing and Pattern Recognition (SP/PR) in Highway Engineering (1993-1997)

Abstract: A large number of computer-aided applications were developed in the 1990s to monitor the highway system in an efficient and effective manner. The hardware used to gather data from various environments and to process large volumes of floating-point operations has greatly improved in the past decade. Large databases are now commonplace, with gigabytes of data that can include images, arrays of discrete time series, or verbal descriptions. The control and analysis algorithms that have to handle these diverse forms of information often include signal processing and pattern recognition (SP/PR) methods. In the past, much of the SP/PR work was based primarily on the Fourier transform; in recent decades, a variety of new approaches have been proposed as alternatives and complements to the Fourier transform approach.

Applications: The primary objective of this study was to classify and present in a systematic manner a selection of modern or second generation SP/PR methods for use by highway engineers. In the introduction, definitions of "signal processing" and "pattern recognition" were proposed to support the layout of this primer. Then, the format used to present each individual SP/PR method was discussed. An attempt was made to define the manner in which the reader should consider the information provided on a given method. The plane of information was related to the level of interest of the user, taking into account his expertise, degree of involvement in the technology, and job objectives. A copyrighted primer of methods was prepared containing a chart which summarizes the scope and applicability to highway problems of each type of method.

Return to top


Fatigue Analysis of Woodrow Wilson Bridge (1992)

Abstract: The rainflow method was applied to strain measurements collected on the Woodrow Wilson Bridge at fatigue-critical structural details to estimate the number and level of stress cycles. A computer program was developed to automate the analysis process.
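
The rainflow method pairs stress reversals into closed cycles. A simplified three-point version, written from scratch in the spirit of ASTM E1049 and making no claim to match the original program, is sketched below.

    import numpy as np

    def turning_points(signal):
        """Keep only local peaks and valleys of the strain/stress history."""
        pts = [signal[0]]
        for s in signal[1:]:
            if len(pts) >= 2 and (pts[-1] - pts[-2]) * (s - pts[-1]) > 0:
                pts[-1] = s          # still moving in the same direction: extend
            elif s != pts[-1]:
                pts.append(s)
        return pts

    def rainflow_ranges(signal):
        """Simplified three-point rainflow count; returns (range, count) pairs."""
        cycles, stack = [], []
        for point in turning_points(signal):
            stack.append(point)
            while len(stack) >= 3:
                x = abs(stack[-1] - stack[-2])   # most recent range
                y = abs(stack[-2] - stack[-3])   # previous range
                if x < y:
                    break
                cycles.append((y, 1.0))          # closed cycle of range y
                del stack[-3:-1]                 # remove the two points forming it
        # Whatever is left on the stack is counted as half cycles.
        cycles += [(abs(b - a), 0.5) for a, b in zip(stack, stack[1:])]
        return cycles

    history = np.array([0, 5, -3, 4, -2, 6, -4, 2, 0])
    print(rainflow_ranges(history))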

Applications: The results of the analysis were used in the report on the condition of the Woodrow Wilson Bridge prepared by the structures laboratory of TFHRC. Dr. Gagarin provided and used a customized version of his proprietary fatigue analysis software developed during his dissertation.

Return to top


Discriminant Analysis of Pull-Out Test Data of Prestressing Strands (1992)

Abstract: An FHWA memorandum was issued in 1989 to restrict existing design practice for determining the development length of prestressing strands. The structures division of the Turner-Fairbank Highway Research Center (TFHRC) researched the behavior of prestressing strands in concrete. One component of the research focused on pull-out tests for prestressing strands. Linear discriminant analysis, together with multivariate analysis and Hotelling's T² statistic, quantified the differences in behavior of the different strands.
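
The two-sample Hotelling's T² statistic and its F approximation are straightforward to reproduce with NumPy and SciPy; the sketch below is a generic version of the test, not the proprietary analysis used in the study.

    import numpy as np
    from scipy import stats

    def hotelling_t2(a, b):
        """Two-sample Hotelling's T^2 test for equal mean vectors.

        a, b: (n, p) arrays of pull-out test measurements for two strand groups.
        Returns (T^2, p-value) using the exact F distribution of the statistic.
        """
        n1, n2, p = len(a), len(b), a.shape[1]
        diff = a.mean(axis=0) - b.mean(axis=0)
        # Pooled sample covariance of the two groups.
        pooled = ((n1 - 1) * np.cov(a, rowvar=False) +
                  (n2 - 1) * np.cov(b, rowvar=False)) / (n1 + n2 - 2)
        t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(pooled, diff)
        # Convert T^2 to an F statistic with (p, n1 + n2 - p - 1) degrees of freedom.
        f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
        p_value = stats.f.sf(f, p, n1 + n2 - p - 1)
        return t2, p_value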

Applications: After the publication of an article in the PCA journal and a presentation to leading members of prestressed-concrete research and industry, the limitations and strengths of the pull-out tests were accepted. Pull-out tests of prestressing strands were performed, and the interpretation of the results followed the basic procedure described in the report prepared for FHWA on this project. Advanced multidimensional statistical analyses were configured and applied to this structural research problem. The copyrighted and proprietary algorithms and software are designed to interpret the data and present the results. With the method developed by Dr. Gagarin, the hypotheses proposed by the research experiment on prestressing strands were demonstrated.

Return to top


Bridge Weigh-In-Motion (1989-1992)

Abstract: A bridge is instrumented with strain gages on its beams and tape switches on the pavement surface near its approach. The deformations of the bridge are acquired synchronously with tape-switch trigger events, which are used to estimate the position of a vehicle on the structure as well as its axle configuration. With a model of the elastic behavior of the cross section, each axle weight is estimated using regression. A newer estimation method determines axle configuration and weights from the strain gage patterns without any tape-switch information.
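
A compact way to view the regression step: with a known strain influence line for the instrumented section, the measured strain record is modeled as a superposition of the influence line shifted to each axle's time-varying position, and the axle weights follow from a linear least-squares fit. The sketch below assumes the influence line, axle spacings, and speed are already known and is only illustrative.

    import numpy as np

    def estimate_axle_weights(strain_record, influence_line, axle_positions, speed, dt):
        """Least-squares estimate of axle weights from a strain time history.

        strain_record:  measured strain at one gage, sampled every dt seconds
        influence_line: unit-load strain influence line sampled every metre along the span
        axle_positions: axle offsets (m) behind the first axle (0 for the lead axle)
        speed:          vehicle speed (m/s), assumed constant over the bridge
        """
        n = len(strain_record)
        t = np.arange(n) * dt
        x_lead = speed * t                       # position of the lead axle versus time
        design = np.zeros((n, len(axle_positions)))
        span = np.arange(len(influence_line))    # influence-line sample positions (m)
        for j, offset in enumerate(axle_positions):
            # Strain produced by a unit load at this axle's position at each instant.
            design[:, j] = np.interp(x_lead - offset, span, influence_line,
                                     left=0.0, right=0.0)
        # Solve strain = design @ weights in the least-squares sense.
        weights, *_ = np.linalg.lstsq(design, strain_record, rcond=None)
        return weights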

Applications: This new technology was not available prior to this research project. Bridge Weigh-In-Motion systems have been used in various states, including Colorado, Michigan, and California. There were three main contributions. The first is the use of finite element analysis in the conventional use of Bridge Weigh-In-Motion and the generation of stress patterns for the fatigue analysis of highway bridges. The second is the use of Bridge Weigh-In-Motion in fatigue analysis. The third is the use of unique and proprietary pattern recognition algorithms, including artificial neural networks, Markov models, and matching techniques, to determine truck axle configuration and axle weights without the use of tape switches on the pavement.

Return to top