Article citation information:

Kozuba, J., Marcisz, M., Rzydzik, S., Paszkuta, M. Using unmanned aerial vehicles in recognizing terrain anomalies encountered in the gas pipeline right-of-way (row). Scientific Journal of Silesian University of Technology. Series Transport. 2024, 123, 57-73.
ISSN: 0209-3324.
DOI: https://doi.org/10.20858/sjsutst.2024.123.3.

 

 

Jarosław KOZUBA[1], Marek MARCISZ[2], Sebastian RZYDZIK[3], Marcin PASZKUTA[4]

 

 

 

USING UNMANNED AERIAL VEHICLES IN RECOGNIZING TERRAIN ANOMALIES ENCOUNTERED IN THE GAS PIPELINE RIGHT-OF-WAY (ROW)

 

Summary. The objective of the research was to characterize and evaluate the impact of weather and lighting conditions on the recording of terrain anomalies in photographs obtained during UAV photogrammetric flights. The present work describes the use and capabilities of a UAV in reproducing photo acquisition conditions similar to those encountered during inspection flights performed with a manned helicopter equipped with a hyperspectral camera operating in the visible light range. The research was conducted in the southern part of Poland (between Gliwice and Katowice), where 7 routes were selected, differing from one another in terms of terrain anomalies (buildings, types of land areas, vehicles, vegetation). The studies, which involved photogrammetric flights performed using a UAV, took into account different seasons and times of day as well as changes in light intensity. The flight specification was based on the following main assumptions: only perpendicular (nadir) RGB photographs, flight altitude of 120 m AGL, strip width of 160 m, GSD ≤0.04 m, and overlap ≥83%. The analysis of the photographic material obtained made it possible to correct the previously defined catalog of anomalies, since the recognition of some objects is very difficult, their size usually being below the orthophotomap resolution. When generating and evaluating orthophotomaps, problems with mapping the shape of objects near the edges of the frame were found. When a 12 mm lens is used, these distortions are significant. It was decided that, for the purpose of generating training data from orthophotomaps, only the fragments containing objects whose shape is mapped in accordance with the real one would be used. Thus, the effective width of orthophotomaps obtained from simulated flights will be approximately 100 m.

Keywords: UAVs, drones, orthophotomaps, terrain surface, terrain anomalies

 

 

1. INTRODUCTION

 

Broadly understood photogrammetry is perceived as the main field of UAV application and operation. The utilization of low-level UAV flights for terrain observation typically yields an orthophotomap or a three-dimensional model of an object. However, the results obtained in this way depend on many conditions and parameters (as well as their values), which causes high diversity of results. The objective of the present research was to characterize and evaluate the impact of individual conditions on the recording of terrain anomalies in the photographs obtained during the photogrammetric flight of a UAV. The conditions affecting the detail of the images included the season of the year, the time of day (morning, noon, afternoon/evening), and the intensity of light, which depends on the weather and, in particular, on the cloud cover. Situations considered dangerous for UAVs [3, 4], such as flights during rain and snowfall and at sub-zero temperatures, were disregarded in the study.

The studies defined in this way were used to prepare and test algorithms for processing images from aerial orthophotomaps in order to detect anomalies found in the gas pipeline's right-of-way (ROW). The objective of the photogrammetric UAV flights over the selected areas was to reproduce photo acquisition conditions similar to those that will ultimately be encountered during inspection flights using a manned helicopter with a hyperspectral camera installed, in the visible light range, and to provide the collected "raw" photos for their final elaboration as orthophotomaps. The flights, however, took place without a geodetic (photogrammetric) control network, without ground control points (photopoints), and without RTK positioning.

 

 

2. LITERATURE REVIEW

 

The practical application of unmanned aerial vehicles has become a new, interdisciplinary area of science and research in the last few years, not least because UAVs enable low-altitude photo campaigns and quick acquisition of high-resolution imaging data in various spectral ranges, as well as data from UAV-borne laser scanning [18, 21, 24]. The term UAV photogrammetry has even appeared: understood as a new tool for photogrammetric measurements, it has opened up many new applications in the short-range domain, combining aerial and ground photogrammetry and introducing low-cost alternatives to classic manned aerial photogrammetry [7]. A similar opinion is expressed in [1], which discusses the role of sensors used on unmanned platforms and the factors affecting their performance, noting that technological progress has enabled the development of unmanned systems and that the scope of their application is constantly growing. The authors of [9] consider imaging using light unmanned aerial vehicles to be one of the fastest growing areas of remote sensing technology. UAVs have already reached a level of reliability and functionality that has allowed this technology to enter the market as an additional platform for acquiring spatial data, with multiple types of sensors available to be mounted on such devices. In practice, however, UAV-based photogrammetry will be accepted only if it provides the required accuracy and added value and is economically competitive with other measurement technologies [20].

Unmanned aerial vehicles have become standard platforms for photogrammetric data acquisition because such systems can be built at reasonable prices and their use is cost-effective [8]. The authors of [17] also note that UAV platforms are now a valuable source of data for inspection, observation, mapping, and 3D modelling as a low-cost alternative to classic manned aerial photogrammetry. Given the emergence of unmanned aerial vehicles equipped with cheap digital cameras and increasingly better photogrammetric methods for digital mapping, there is a need for effective methods for quick terrain measurements with appropriate accuracy.

The quality and accuracy of the obtained results depend on: the UAV system (UAV platform and camera), flight planning and image acquisition (flight altitude, image overlap, UAV speed, flight line orientation, camera configuration, and georeferencing), photogrammetric model generation (software, image alignment, dense point cloud generation, and ground filtering), and geomorphology and land use/cover [10]. The issues of geometric accuracy analysis and sensor calibration have already been pointed out by [5], and technical aspects were discussed, among others, by [22]. The fact that unmanned aerial vehicles are a new frontier in a wide range of monitoring and research applications was also pointed out by [16], who wrote that, in order to fully use their potential, the key challenge is to plan missions for effective data acquisition in complex environments. The fact that accuracy depends to a large extent on the ground resolution (flight altitude) was also mentioned by [11]. It should also be noted that data collection under highly variable weather and lighting conditions throughout the year is necessary for many UAV imaging system applications and, as described by [19], is a new feature in rigorous photogrammetric processing and remote sensing.

In today's fast-moving world, the need for accurate, up-to-date data is increasing for various reasons. In practice, according to [15], this may concern the relief of buildings in cities, forestry, agriculture, or infrastructure. The possibilities of photogrammetry have greatly increased with the introduction of digital aerial cameras and digital technologies. For the cadastral registration of objects, i.e., plot boundaries and outlines of buildings, high-resolution aerial photographs are now commonly used as alternative data sources [12]. Low-cost unmanned aerial system technologies are also used for mapping, agriculture, and surveillance in semi-built-up areas [23].

The application of geospatial techniques is noticeable in precision farming, e.g., in identifying changes in the terrain. The high resolution of the photos is of particular importance in relation to changes in crop and soil conditions [26]. Similar opinions are expressed in [14] in relation to hydraulic modelling and in [13] in relation to modelling the productivity of ecosystems by determining the Leaf Area Index (LAI).

Unmanned aerial photogrammetric measurements are increasingly used to map and study natural hazards or areas where human entry is considered potentially dangerous and inadvisable [25]. Examples of the use of remote sensing methods to assess the tourism and recreation potential of lakes, with unmanned aerial vehicles as a tool offering new measurement opportunities in areas difficult to survey, such as systems of rivers and lakes, were presented by [2]. Documentation of hiking trails in Alpine terrain with the use of UAVs was presented by [6].

 

 

3. METHOD

 

The research studies were conducted over one full calendar year, from March 2022 to March 2023, implementing a fractional plan of experiments which covered, as already mentioned, different seasons, times of day, and illuminance (the amount of light). Illuminance values were divided into low (<1000 lux), medium (1000-25000 lux), and high (>25000 lux) ranges. Each time before a flight, illuminance was measured using a Voltcraft LX-10 lux meter with a measurement range of 0-199900 lx (Fig. 1).
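The illuminance thresholds above can be expressed as a simple classifier; a minimal sketch (the function name and the handling of readings exactly at the 1000 lx and 25000 lx boundaries are our assumptions, as the source gives only the ranges):

```python
# Illuminance classes used in the study: low (<1000 lx),
# medium (1000-25000 lx), high (>25000 lx).
def classify_illuminance(lux: float) -> str:
    """Map a lux-meter reading to the study's illuminance class."""
    if lux < 0:
        raise ValueError("illuminance cannot be negative")
    if lux < 1_000:
        return "low"
    if lux <= 25_000:
        return "medium"
    return "high"
```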

 

 

Fig. 1. LX-10 luxmeter

 

The plan of experiments assumed seven flight routes, which were diversified in terms of the presence of terrain anomalies (Tab. 1).

 

                                                                                                                                      Tab. 1

Catalog of anomalies

 

Category ID | Category | Class ID | Class | Description
1 | Structures | 1.1 | Buildings | e.g., house, garage, shed, carport, container
1 | Structures | 1.2 | Fences | e.g., wire mesh, wall
1 | Structures | 1.3 | Landfill sites | containing e.g., loose materials, scrap, tires
1 | Structures | 1.4 | Parking lots | –
1 | Structures | 1.5 | Masts, towers, poles | –
2 | Land areas | 2.1 | Heap | e.g., point, longitudinal
2 | Land areas | 2.2 | Excavation | e.g., point, longitudinal
2 | Land areas | 2.3 | Water body | –
3 | Vehicles | 3.1 | Vehicles | e.g., passenger cars, vans, trucks, semi-trucks, construction vehicles: excavators, bulldozers, road rollers, tippers
4 | Vegetation | 4.1 | Trees and bushes | –
4 | Vegetation | 4.2 | Forest areas | –
4 | Vegetation | 4.3 | Burned-out grass | –
5 | Other | 5.1 | – | All except ID 1-4

 

Each month, a single flight was planned to cover each of the seven routes, which gave a total of 84 flights throughout the year (the period of the study). The routes were selected in Southern Poland (Fig. 2), between the airport zones of the Gliwice Aeroclub (Gliwice, EPGL) and the Silesian Aeroclub (Katowice, EPKM). Their diversity regarding the anomalies occurring on them is presented in Table 2. Routes 1, 2, and 3 were situated in Gierałtowice, routes 4, 5, and 6 in Chudów, and route 7 in Ruda Śląska-Halemba.

 

 

Fig. 2. Location of photogrammetric flight routes

 

                                                                                                                      Tab. 2

Anomalies found in photogrammetric flight routes

 

Route | Anomalies per Tab. 1
1 | 1.1, 1.2, 1.4, 1.5, 3.1, 4.1
2 | 1.1, 1.2, 1.3, 1.5, 2.1, 2.2, 3.1, 4.1, 4.3
3 | 2.1, 2.2, 2.3, 4.1, 4.3
4 | 1.1, 1.2, 1.3, 2.1, 2.2, 2.3, 3.1, 4.1, 4.2
5 | 1.1, 1.2, 2.1, 2.2, 3.1, 4.1, 4.2
6 | 1.1, 1.2, 2.1, 2.2, 3.1, 4.1, 4.3
7 | 1.3, 2.3, 4.1, 4.2, 4.3

 

A sample route is presented in Figure 3.

 

Fig. 3. Sample photogrammetric flight route

 

Selection of the flight route locations, dictated by the presence of particular anomalies, was performed in Google Earth. After saving the project, files in the .kml format were exported to a DJI CrystalSky monitor with the DJI Pilot application installed, which was then used to conduct the flights of a DJI Matrice 210 V2 UAV (Fig. 4).

 

 

Fig. 4. DJI Matrice 210 V2 UAV with a DJI Zenmuse X5S camera and an Olympus M.Zuiko 12 mm lens installed (left) and DJI CrystalSky monitor with Route 7 visible (right)

 

A single photogrammetric flight was a one-way flight along the route, forming a strip of imagery, without a return flight. The length of the observation strip was assumed to be at least 500 m and its width not less than 160 m, while maintaining the required pixel size on the ground. The analysis was conducted based on values determined using the calculator: https://www.scantips.com/lights/fieldofview.html#top.

In order to simulate a flight of a manned helicopter over the selected areas, all the radii of curves en route were rounded off and sharp turns were minimized (the issue was consulted with helicopter pilots). The studies assumed a Ground Sampling Distance (GSD) no worse than 4 cm (Fig. 5). GSD parameters were validated using the calculator: https://www.handalselaras.com/calculator/ (Fig. 6).

Maintaining such precise quality parameters required a flight altitude of 120 m AGL (Tab. 3). The assumed overlap, not worse than 85%, defined the flight speed as 3.5 m/s.
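The link between overlap and flight speed can be sketched with the usual forward-overlap relation. The 130 m along-track footprint and the 85% overlap are taken from the study; the capture interval below is derived from these figures and the 3.5 m/s speed, not stated in the source:

```python
# Forward overlap constrains how far the UAV may advance between exposures.
def forward_advance(footprint_h_m: float, overlap: float) -> float:
    """Ground distance allowed between consecutive exposures."""
    return (1.0 - overlap) * footprint_h_m

advance = forward_advance(130.0, 0.85)  # 19.5 m between exposures
interval = advance / 3.5                # at 3.5 m/s: ~5.6 s per photo
```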

 

 

Fig. 5. The footprint width/distance covered on the ground by one image in width direction (Dw) – symbols explained in Table 3

 

 

Fig. 6. Sample screenshot of the GSD calculator

 

                                                                                                                                         Tab. 3

Ground Sampling Distance (GSD) calculation

 

Parameter symbol | Parameter name | Parameter value
Sw | sensor width of the camera (millimeters) | 17.3
FR | focal length of the camera lens (millimeters) | 12
H | flight height (meters) | 120
imW | image width (pixels) | 5280
imH | image height (pixels) | 3956
GSD | Ground Sampling Distance (centimeters/pixel) | 3.28
Dw | width of single image footprint on the ground (meters) | 173
DH | height of single image footprint on the ground (meters) | 130

 

Requirements concerning the weather conditions for the flights were also specified; in this approach, flights were conducted only in the absence of precipitation. A flight was planned if the risk of precipitation was below 50%, with no or moderate cloud cover, no or weak wind, and with the sun disc clearly visible above the horizon. These conditions were checked each time using both the https://awiacja.imgw.pl/ website and dedicated applications, e.g., UAV Forecast, Airdata UAV, or Meteo IMGW (Fig. 7). The most significant factor in deciding whether to conduct a photogrammetric flight (or a series of flights) on a specific day was favorable values of the weather parameters.

 

 

Fig. 7. Sample readings of weather applications before conducting a flight along Route 4: UAV Forecast (left) and AirData UAV (right)

 

RGB photos for the orthophotomaps were taken from the nadir perspective (pointing directly downward), in *.raw (*.dng) and *.jpg formats, using a DJI Zenmuse X5S camera carried by the UAV with an Olympus M.Zuiko 12 mm f/2.0 lens (Fig. 4), whose parameters were taken from: https://www.dxomark.com/Cameras/DJI/Zenmuse-X5S---Specifications.

Each time, flight logs were analysed in the Airdata UAV application (Fig. 8).

 

 

4. RESULTS

 

The scope and duration of the tests, the categories of selected anomalies, and the planned routes resulted in corrections to the assumed plan of activities.

The plan of flights was carried out in full (27 cases) only for Route 1 (Fig. 9), while for the remaining six routes (Route 2 - Route 7) flights were conducted only for nine selected cases each (Fig. 10), giving a total of 81 flights.

 

 

Fig. 8. Sample logs from the AirData application after completing a flight along Route 4: general data – overview (top left), general data – equipment (top right), weather (bottom left), and in-flight wind (bottom right)

 

The length of the designated routes ranged from 629 m (Route 1) to 876 m (Route 4), which met the assumed length of the observation strip. Remaining route lengths: Route 2 – 656 m, Route 3 – 859 m, Route 5 – 685 m, Route 6 – 836 m, and Route 7 – 633 m.

Sample photogrammetric images from flights conducted within the fractional plan of flights are presented in Figure 11.

The analysis of the photographs started with reading the metadata (EXIF) from the *.dng and *.jpg files. The *.dng files contain photographs in raw format, while the *.jpg files contain standard RGB images; RGB images with metadata can be read from both. Logs in *.csv format were also downloaded from the AirData service (Fig. 12).
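A flight log exported as *.csv can be processed with a standard CSV reader; a minimal sketch (the column names `time`, `altitude_m`, and `wind_speed_ms` are illustrative assumptions, as the actual AirData export uses its own header names):

```python
import csv
import io

# Illustrative log fragment; column names are hypothetical.
sample_log = """time,altitude_m,wind_speed_ms
0.0,0.0,1.2
30.0,118.7,2.1
60.0,120.1,2.4
"""

def max_altitude(csv_text: str) -> float:
    """Return the maximum recorded altitude in a log."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return max(float(row["altitude_m"]) for row in reader)

print(max_altitude(sample_log))  # 120.1
```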

The photographic material recorded was used as a basis for generating an orthophotomap (Fig. 13).

 

 

5. DISCUSSION

 

Considering the results obtained, it was possible to correct the catalog of anomalies (Tab. 1).

Class 1.2 (Fences) was removed because, in the overhead view used in orthophotomaps, the recognition of objects whose 2D dimensions are below the resolving power of the camera system is very difficult.

 

 

Fig. 9. Card of all planned flights for Route 1 with a sample entry

 

 

 

Fig. 10. Sample card of fractional plan of flights for Route 5 with a sample entry

 

 

Fig. 11. Sample photogrammetric images from selected routes and the implementation of the fractional plan of flights: Spring, morning, Route 4 (left) and Spring, afternoon, Route 1 (right)

 

 

 

Fig. 12. A fragment of the AirData application log – a *.csv file (header and a sample data line)

 

 

 

Fig. 13. Sample orthophotomap (Route 2) and distortions of real objects resulting from the use of a short-focus lens

 

For Class 1.5 (Masts, Towers, Poles), the situation is similar because the detection of such objects requires an additional perspective – a single-point dimension of an object such as a mast, tower, or pole in 2D (overhead view) is usually below the resolving power of an orthophotomap.

The same problems as mentioned above were also observed for Class 4.2 (Forest areas). Originally, Class 4.2 was meant to simplify detection in locations with groups of trees but, finally, a common Class 4.1 (Trees and Bushes) was adopted, in which, considering the ground surfaces, single trees and their groups are distinguished. There is an ongoing discussion concerning the minimum detectable size of an object considered to be an anomaly, as the large diversity of shapes makes classification very difficult.

In the case of landfills (Class 1.3), collecting training data is problematic because of the diversity of such objects, since the appearance of loose materials resembles that of earth heaps. Analyzing heaps (Class 2.1) and excavations (Class 2.2) on the basis of an orthophotomap is problematic under certain conditions. In this case, it is necessary to determine the minimum sizes of objects (constituting an anomaly) belonging to these classes, due to the high probability of confusing them with loose material storage sites, as mentioned above.

In the case of water bodies (Class 2.3), divided into natural and man-made, it was difficult to distinguish between these two types.

Due to a very large variety of vehicles, the authors concentrated on one, general, Class 3.1 (Vehicle), but it cannot be ruled out that in the future this class will be extended, and individual types of vehicles will be distinguished in accordance with Table 1.

Another observation concerns weather conditions, whose changes are only seemingly predictable, as shown by the differences between Figure 7 (conditions before a flight, on which the decision to conduct the flight was based) and Figure 8 (in-flight conditions found in the post-flight logs).

According to the assumptions, orthophotomaps were generated from single flights. Unfortunately, this causes problems with mapping the shapes of objects located at the edges of the frame. When a lens with a focal length of 12 mm is used, these distortions are significant, as shown in the blow-up of a fragment of the orthophotomap (Fig. 13). These distortions can be avoided by conducting multiple flights over the same area in order to obtain more images; this, however, conflicts with the assumption of a single flight along the pipeline. It should also be noted that in the target flights a lens with a focal length of 25 mm will be used and the flights will be performed at an altitude of 300 m, which will result in less distortion of objects situated at the edge of the frame. It was decided that, for the purpose of generating training data from orthophotomaps, only the fragments containing objects whose shape is mapped in accordance with the real one would be used. Thus, the effective width of orthophotomaps obtained from simulated flights will be approximately 100 m (Fig. 14).
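The per-side crop implied by narrowing the 173 m footprint to the ~100 m effective band can be sketched as follows; mapping the crop to pixels assumes a uniform ground scale across the frame, which is a simplification (the real distortion is strongest near the edges):

```python
# Crop needed to keep only the low-distortion central band of each strip.
footprint_w_m = 173.0   # single-image footprint width (Tab. 3)
effective_w_m = 100.0   # usable orthophotomap width from the study
image_w_px = 5280       # image width in pixels (Tab. 3)

margin_m = (footprint_w_m - effective_w_m) / 2.0          # ~36.5 m per side
margin_px = round(image_w_px * margin_m / footprint_w_m)  # ~1114 px per side
```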

 

 

6. CONCLUSION

 

The research confirms that conducting a series of autonomous (automatic) photogrammetric UAV flights with constant parameters (altitude, flight speed, camera tilt) along strictly defined routes makes it possible to collect photogrammetric material sufficient to build a training database that takes weather and lighting conditions into account.

Not all ground objects (e.g., single-point or linear ones) can be included in the catalog of anomalies due to their specific properties, which make them poorly visible (or invisible) during a nadir flight; this narrows down the content and volume of the catalog.

Image distortions and errors in mapping the shape of objects can be compensated to some extent by conducting multiple flights along the same route (multiplying the amount of data and the number of images), as well as by selecting the appropriate lens, focal length, and flight altitude.

 

 

Fig. 14. Determining the useful area for analyzing the detection and classification of anomalies (Route 2)

 

 

Acknowledgements

 

The Authors would like to acknowledge that the research leading to results described in the paper has been co-financed by the European Union from the European Regional Development Fund under the Intelligent Development Program and the Gas Transmission Operator GAZ-SYSTEM S.A. The project is carried out under the competition of the National Centre for Research and Development: 4/4.1.1/2019 as part of the INGA joint venture.

References

 

1.             Balestrieri Eulalia, Pasquale Daponte, Luca De Vito, Francesco Lamonaca. 2021. „Sensors and Measurements for Unmanned Systems: An Overview”. Sensors 21: 1518. DOI: https://doi.org/10.3390/s21041518.

2.             Borkowski Grzegorz, Adam Młynarczyk. 2019. „Remote sensing using unmanned aerial vehicles for tourist-recreation lake valuation and development”. Quaestiones Geographicae 38(1): 5-14. ISSN 0137-477X, ISSN 2081-6383. DOI: 10.2478/quageo-2019-0012.

3.             Commission Delegated Regulation (EU) 2019/945 of 12 March 2019 on unmanned aircraft systems and on third-country operators of unmanned aircraft systems.

4.             Commission Implementing Regulation (EU) 2019/947 of 24 May 2019 on the rules and procedures for the operation of unmanned aircraft.

5.             Cramer Michael, Norbert Haala. 2010: „DGPF Project: Evaluation of digital photogrammetric aerial-based Imaging Systems – Overview and Results from the Pilot Centre”. Photogrammetric Engineering and Remote Sensing 76(9): 1019-1029. ISSN: 0099-1112.

6.             Ćwiąkała Paweł, Rafał Kocierz, Edyta Puniach, Michał Nędzka, Karolina Mamczarz, Witold Niewiem, Paweł Wiącek. 2018. „Assessment of the Possibility of Using Unmanned Aerial Vehicles (UAVs) for the Documentation of Hiking Trails in Alpine Areas”. Sensors 18: 81. DOI:10.3390/s18010081.

7.             Eisenbeiss Henri. 2009. „UAV photogrammetry”. PhD thesis, Zurich, Switzerland: ETH Zurich.

8.             Haala Norbert, Michael Cramer, Florian Weimer, Martin Trittler. 2011. „Performance test on UAV-based photogrammetric data collection”. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XXXVIII-1/C22: 7-12. DOI: 10.5194/isprsarchives-XXXVIII-1-C22-7-2011.

9.             Honkavaara Eija, Heikki Saari, Jere Kaivosoja, Ilkka Pölönen, Teemu Hakala, Paula Litkey, Jussi Mäkynen, Liisa Pesonen. 2013. „Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture”. Remote Sensing 5: 5006-5039. ISSN 2072-4292. DOI: 10.3390/rs5105006.

10.         Jiménez-Jiménez Sergio Iván, Waldo Ojeda-Bustamante, Mariana de Jesús Marcial-Pablo, Juan Enciso. 2021. „Digital Terrain Models Generated with Low-Cost UAV Photogrammetry: Methodology and Accuracy”. International Journal of Geo-Information 10: 3-4. DOI: https://doi.org/10.3390/ijgi10050285.

11.         Küng Olivier, Christoph Strecha, Antoine Beyeler, Jean-Christophe Zufferey, Dario Floreano, Pascal Fua, Francois Gervaix. 2011. „The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery”. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XXXVIII-1/C22: 125-130. DOI: 10.5194/isprsarchives-XXXVIII-1-C22-125-2011.

12.         Kurczyński Zdzisław, Krzysztof Bakuła, Marcin Karabin, Michał Kowalczyk, Jakub Stefan Markiewicz, Wojciech Ostrowski, Piotr Podlasiak, Dorota Zawieska. 2016. „The possibility of using images obtained from the UAS in cadastral works". The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLI-B1: 909-915. DOI: 10.5194/isprs-archives-XLI-B1-909-2016.

13.         Li Minhui, Redmond Ramin Shamshiri, Michael Schirrmann, Cornelia Weltzien, Sanaz Shafian, Morten Stigaard Laursen. 2022. „UAV Oblique Imagery with an Adaptive Micro-Terrain Model for Estimation of Leaf Area Index and Height of Maize Canopy from 3D Point Clouds". Remote Sensing 14: 585. DOI: https://doi.org/10.3390/rs14030585.

14.         Mazzoleni Maurizio, Paolo Paron, Andrea Reali, Dinis Juizo, Josè Manane, Luigia Brandimarte. 2020. „Testing UAV-derived topography for hydraulic modelling in a tropical environment”. Natural Hazards 103: 139-163. DOI: https://doi.org/10.1007/s11069-020-03963-4.

15.         Pecho Pavol, Iveta Škvareková, Villiam Aţaltovič, Martin Bugaj. 2019. „UAV usage in the process of creating 3D maps by RGB spectrum”. Transportation Research Procedia 43: 328-333. DOI: 10.1016/j.trpro.2019.12.048.

16.         Popović Marija, Teresa Vidal-Calleja, Gregory Hitz, Jen Jen Chung, Inkyu Sa, Roland Siegwart, Juan Nieto. 2020. „An informative path planning framework for UAV-based terrain monitoring”. Autonomous Robots 44: 889-911. DOI: https://doi.org/10.1007/s10514-020-09903-2.

17.         Remondino Fabio, Luigi Barazzetti, Francesco Nex, Marco Scaioni, Daniele Sarazzi. 2011. „UAV photogrammetry for mapping and 3D modelling – current status and future perspectives". International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XXXVIII-1/C22: 25-29. DOI: 10.5194/isprsarchives-XXXVIII-1-C22-25-2011.

18.         Rochala Zdzisław. 2011. „On board data acquisition system with intelligent transducers for unmanned aerial vehicles”. Archives of Transport 23 (4): 521-529. DOI: 10.2478/v10174-011-0035-4.

19.         Rosnell Tomi, Eija Honkavaara, Kimmo Nurminen. 2011. „On geometric processing of multi-temporal image data collected by light UAV systems”. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XXXVIII-1/C22: 63-68. DOI: 10.5194/isprsarchives-XXXVIII-1-C22-63-2011.

20.         Sauerbier Martin, Emil Siegrist, Henri Eisenbeiss, Nusret Demir. 2011. „The practical application of UAV-based photogrammetry under economic aspects”. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XXXVIII-1/C22: 45-50. DOI: 10.5194/isprsarchives-XXXVIII-1-C22-45-2011.

21.         Sawicki Piotr. 2012. „Unmanned aerial vehicles in photogrammetry and remote sensing – state of the art and trends”. Archiwum Fotogrametrii, Kartografii i Teledetekcji 23: 365-376. ISSN 2083-2214. ISBN 978-83-61576-19-8.

22.         Szczechowski Bogdan. 2008. „The use of unmanned aerial vehicles (mini helicopters) in photogrammetry from low level”. Archiwum Fotogrametrii, Kartografii i Teledetekcji 18: 569-579. ISBN 978-83-61576-08-2.

23.         Tahar Khairul Nizam. 2012. „Aerial terrain mapping using unmanned aerial vehicle approach”. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 29: 493-498. DOI: 10.5194/isprsarchives-XXXIX-B7-493-2012.

24.         Ułanowicz Leszek, Ryszard Sabak. 2021. „Unmanned aerial vehicles supporting imagery intelligence using the structured light technology”. Archives of Transport 58 (2): 35-45. DOI: 10.5604/01.3001.0014.8796.

25.         Žabota Barbara, Milan Kobal. 2021. „Accuracy Assessment of UAV-Photogrammetric-Derived Products Using PPK and GCPs in Challenging Terrains: In Search of Optimized Rockfall Mapping". Remote Sensing 13: 3812. DOI: https://doi.org/10.3390/rs13193812.

26.         Zhang Chunhua, John Michael Kovacs. 2012. „The application of small unmanned aerial systems for precision agriculture: a review”. Precision Agriculture 13: 693-712. DOI: 10.1007/s11119-012-9274-5.

 

 

Received 10.12.2023; accepted in revised form 03.04.2024

 

 

Scientific Journal of Silesian University of Technology. Series Transport is licensed under a Creative Commons Attribution 4.0 International License



[1] Silesian University of Technology, Faculty of Transport and Aviation Engineering, Zygmunta Krasińskiego Str. 8, 40-019 Katowice, Poland. Email: jaroslaw.kozuba@polsl.pl. ORCID: 0000-0003-3394-4270

[2] Silesian University of Technology, Faculty of Transport and Aviation Engineering, Zygmunta Krasińskiego Str. 8, 40-019 Katowice, Poland. Email: marek.marcisz@polsl.pl. ORCID: 0000-0002-8178-880X

[3] Silesian University of Technology, Faculty of Mechanical Engineering, Konarskiego Str. 18A, 40-100 Gliwice, Poland. Email: sebastian.rzydzik@polsl.pl. ORCID: 0000-0003-3352-3986

[4] Silesian University of Technology, Faculty of Automatics, Electronics and Computer Science, Akademicka 16 Street, 44-100 Gliwice, Poland. Email: marcin.paszkuta@polsl.pl. ORCID: 0000-0002-7136-0797