
Generative Adversarial Approach to Urban Areas’ NDVI Estimation: A Case Study of Łódź, Poland

Open Access | Jan 2023

References

  1. Adamiak M., Będkowski K., Majchrowska A., 2021. Aerial imagery feature engineering using bidirectional generative adversarial networks: A case study of the Pilica River Region, Poland. Remote Sensing 13(2): 306. DOI 10.3390/rs13020306.
  2. Aslahishahri M., Stanley K.G., Duddu H., Shirtliffe S., Vail S., Bett K., Pozniak C., Stavness I., 2021. From RGB to NIR: Predicting of near infrared reflectance from visible spectrum aerial images of crops. In: 2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 1312–1322. DOI 10.1109/ICCVW54120.2021.00152.
  3. Bagheri N., Ahmadi H., Alavi Panah S., Omid M., 2013. Multispectral remote sensing for site-specific nitrogen fertilizer management. Pesquisa Agropecuária Brasileira 48: 1394–1401. DOI 10.1590/S0100-204X2013001000011.
  4. Barley A., Town C., 2014. Combinations of feature descriptors for texture image classification. Journal of Data Analysis and Information Processing 2(3): 67–76. DOI 10.4236/jdaip.2014.23009.
  5. Barwiński M., 2009. Spatial development and functional changes in Łódź – Geographic, economic and political conditions. Geografia w szkole 6: 38–50.
  6. Będkowski K., Bielecki A., 2017. Assessment of the availability of greenery in the place of residence in cities using NDVI and the Lorenz's concentration curve. Teledetekcja Środowiska 57: 5–14.
  7. Chai T., Draxler R.R., 2014. Root mean square error (RMSE) or mean absolute error (MAE)? – Arguments against avoiding RMSE in the literature. Geoscientific Model Development 7(3): 1247–1250. DOI 10.5194/gmd-7-1247-2014.
  8. Chew W.C., Hashim M., Lau A.M.S., Battay A.E., Kang C.S., 2014. Early detection of plant disease using close range sensing system for input into digital earth environment. IOP Conference Series: Earth and Environmental Science 18: 012143. DOI 10.1088/1755-1315/18/1/012143.
  9. Chollet F., 2017. Xception: Deep learning with depthwise separable convolutions. arXiv:1610.02357 [cs]. Online: http://arxiv.org/abs/1610.02357.
  10. Davis C.H., Wang X., 2011. High-resolution DEMs for urban applications from NAPP photography. Photogrammetric Engineering and Remote Sensing 67: 4–11.
  11. Deering D., 1978. Rangeland reflectance characteristics measured by aircraft and spacecraft sensors. Dissertation, Texas A&M University. Online: https://oaktrust.library.tamu.edu/handle/1969.1/DISSERTATIONS-253780.
  12. Dematteis N., Giordan D., 2021. Comparison of digital image correlation methods and the impact of noise in geoscience applications. Remote Sensing 13(2): 327. DOI 10.3390/rs13020327.
  13. Demir U., Unal G., 2018. Patch-based image inpainting with generative adversarial networks. arXiv:1803.07422 [cs]. Online: http://arxiv.org/abs/1803.07422.
  14. Donahue J., Simonyan K., 2019. Large scale adversarial representation learning. arXiv:1907.02544 [cs, stat]. Online: http://arxiv.org/abs/1907.02544.
  15. Dong J., Yin R., Sun X., Li Q., Yang Y., Qin X., 2019. Inpainting of remote sensing SST images with deep convolutional generative adversarial network. IEEE Geoscience and Remote Sensing Letters 16(2): 173–177. DOI 10.1109/LGRS.2018.2870880.
  16. EnviroSolutions Sp. z o.o. – Michał Włoga, 2021. Pobieracz danych GUGiK. Online: https://plugins.qgis.org/plugins/pobieracz_danych_gugik/.
  17. Geoportal, 2021. Online: http://geoportal.gov.pl.
  18. Gu Y., Brown J., Verdin J., Wardlow B., 2007. A five-year analysis of MODIS NDVI and NDWI for grassland drought assessment over the central Great Plains of the United States. Geophysical Research Letters 34(6). DOI 10.1029/2006GL029127.
  19. Haralick R.M., Shanmugam K., Dinstein I., 1973. Textural features for image classification. IEEE Transactions on Systems, Man, and Cybernetics SMC 3(6): 610–621. DOI 10.1109/TSMC.1973.4309314.
  20. Hatfield J., Prueger J., 2010. Value of using different vegetative indices to quantify agricultural crop characteristics at different growth stages under varying management practices. Remote Sensing 2(2): 562–578. DOI 10.3390/rs2020562.
  21. Head Office of Geodesy and Cartography (Główny Urząd Geodezji i Kartografii), n.d. Integrated copies of databases of topographic objects. Online: https://www.geoportal.gov.pl/dane/baza-danych-obiektow-topograficznych-bdot (accessed 11 November 2020).
  22. Head Office of Geodesy and Cartography, n.d. Online: https://www.gov.pl/web/gugik-en (accessed 8 August 2022).
  23. Herold M., Liu X., Clarke K., 2003. Spatial metrics and image texture for mapping urban land use. Photogrammetric Engineering and Remote Sensing 69: 991–1001. DOI 10.14358/PERS.69.9.991.
  24. Horé A., Ziou D., 2010. Image quality metrics: PSNR vs. SSIM. In: 2010 20th International Conference on Pattern Recognition, 2366–2369. DOI 10.1109/ICPR.2010.579.
  25. Hunt E.R., Rock B., 1989. Detection of changes in leaf water content using near- and middle-infrared reflectances. Remote Sensing of Environment 30(1): 43–54. DOI 10.1016/0034-4257(89)90046-1.
  26. Isola P., Zhu J-Y., Zhou T., Efros A., 2017. Image-to-image translation with conditional adversarial networks. arXiv:1611.07004 [cs]. Online: http://arxiv.org/abs/1611.07004.
  27. Jackson R., Huete A., 1991. Interpreting vegetation indices. Preventive Veterinary Medicine 11(3): 185–200. DOI 10.1016/S0167-5877(05)80004-2.
  28. Jackson T., Chen M., Cosh M., Li F., Anderson M., Walthall C., Doriaswamy P., Ray Hunt R., 2004. Vegetation water content mapping using Landsat data derived normalized difference water index for corn and soybeans. Remote Sensing of Environment, 2002 Soil Moisture Experiment (SMEX02), 92(4): 475–482. DOI 10.1016/j.rse.2003.10.021.
  29. Jarocińska A., Zagajewski B., 2008. Correlations of ground- and airborne-level acquired vegetation indices of the Bystrzanka catchment. Teledetekcja Środowiska 40: 100–124.
  30. Jung A., 2022. Imgaug. Python. Online: https://github.com/aleju/imgaug.
  31. Koza P., 2006. Orientation of Ikonos stereo images and automatic acquisition of height models. Archiwum Fotogrametrii, Kartografii i Teledetekcji 16. Online: http://yadda.icm.edu.pl/baztech/element/bwmeta1.element.baztech-3514d2c7-31a9-49d8-ad2d-c35825c950f8.
  32. Krukowski M., 2018. Modelowanie Kartograficzne w Ocenie Jakości Życia w Mieście – Aspekt Zieleni Miejskiej w Lublinie [Cartographic modelling in the assessment of the quality of life in a city – the aspect of urban greenery in Lublin]. Annales Universitatis Mariae Curie-Sklodowska, Sectio B – Geographia, Geologia, Mineralogia et Petrographia 73: 7–27. DOI 10.17951/b.2018.73.0.7-27.
  33. Krukowski M., Cebrykow P., Płusa J., 2016. Classification of green areas in Lublin based on satellite images Ikonos 2. Barometr Regionalny 14(2): 35–44.
  34. Książek J., 2018. Study of selected textural features properties on asbestos roof images. Geomatics and Environmental Engineering 12(4). DOI 10.7494/geom.2018.12.4.45.
  35. Kuang W., Dou Y., 2020. Investigating the patterns and dynamics of urban green space in China's 70 major cities using satellite remote sensing. Remote Sensing 12(12): 1929. DOI 10.3390/rs12121929.
  36. Kubalska J., Preuss R., 2014. Use of the photogrammetric data for vegetation inventory on urban areas. Archiwum Fotogrametrii, Kartografii i Teledetekcji 26: 75–86. DOI 10.14681/AFKIT.2014.006.
  37. Łachowski W., Łęczek A., 2020. Tereny zielone w dużych miastach Polski. Analiza z wykorzystaniem Sentinel 2 [Green areas in large cities of Poland: An analysis using Sentinel-2]. Urban Development Issues 66(1): 77–90. DOI 10.51733/udi.2020.68.07.
  38. Li P., Cheng T., Guo J., 2009. Multivariate image texture by multivariate variogram for multispectral image classification. Photogrammetric Engineering & Remote Sensing 75(2): 147–157. DOI 10.14358/PERS.75.2.147.
  39. Li X., Ratti C., 2018. Mapping the spatial distribution of shade provision of street trees in Boston using google street view Panoramas. Urban Forestry & Urban Greening 31: 109–119. DOI 10.1016/j.ufug.2018.02.013.
  40. Marmol U., Lenda G., 2010. Texture filters in the process of automatic object classification. Archiwum Fotogrametrii, Kartografii i Teledetekcji 21: 235–243.
  41. McPherson G., Xiao Q., van Doorn N., Johnson N., Albers S., Peper P., 2018. Shade factors for 149 taxa of in-leaf urban trees in the USA. Urban Forestry & Urban Greening 31: 204–211. DOI 10.1016/j.ufug.2018.03.001.
  42. Mirza M., Osindero S., 2014. Conditional generative adversarial nets. arXiv:1411.1784 [cs, stat]. Online: http://arxiv.org/abs/1411.1784.
  43. Müller M., Ekhtiari N., Almeida R., Rieke C., 2020. Super-resolution of multispectral satellite images using convolutional neural networks. In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences V-1–2020 (August): 33–40. DOI 10.5194/isprs-annals-V-1-2020-33-2020.
  44. Myeong S., Nowak D., Hopkins P., Brock R., 2003. Urban cover mapping using digital, high-resolution aerial imagery. Urban Ecosystems 5: 243–256. Online: http://www.fs.usda.gov/treesearch/pubs/18820.
  45. Nowak D., Greenfield E., 2012. Tree and impervious cover change in U.S. cities. Urban Forestry & Urban Greening 11(1): 21–30. DOI 10.1016/j.ufug.2011.11.005.
  46. NumPy documentation. Online: https://numpy.org/doc/stable/reference/generated/numpy.savez.html (accessed 27 January 2022).
  47. OpenCV, n.d. Online: https://opencv.org/ (accessed 27 January 2022).
  48. Pluto-Kossakowska J., Władyka M., Tulkowska W., 2018. Assessment of remote sensing image data to identify objects in green and blue infrastructure. Teledetekcja Środowiska 59. Online: http://yadda.icm.edu.pl/baztech/element/bwmeta1.element.baztech-9632f302-e255-497e-a9dd-368ea620f9b4.
  49. Pyra M., Adamczyk J., 2018. Object-oriented classification in the inventory of green infrastructure objects on the example of the Ursynów District in Warsaw. Teledetekcja Środowiska 59. Online: http://yadda.icm.edu.pl/baztech/element/bwmeta1.element.baztech-8bd759f8-2ab3-4b35-946d-b34b73f28b88.
  50. Rouse J.W., Jr., Haas R.H., Schell J.A., Deering D.W., 1973. Monitoring the vernal advancement and retrogradation (green wave effect) of natural vegetation. Texas A&M Univ. College Station, TX, United States.
  51. Salimans T., Goodfellow I., Zaremba W., Cheung V., Radford A., Chen X., 2016. Improved techniques for training GANs. arXiv:1606.03498 [cs]. Online: http://arxiv.org/abs/1606.03498.
  52. Scikit-learn: Machine learning in Python – Scikit-learn 1.0.2 documentation, n.d. Online: https://scikit-learn.org/stable/ (accessed 27 January 2022).
  53. Small C., 2001. Estimation of urban vegetation abundance by spectral mixture analysis. International Journal of Remote Sensing 22(7): 1305–1334. DOI 10.1080/01431160151144369.
  54. Statistics Poland, 2020. Statistics of Łódź 2020. Online: https://lodz.stat.gov.pl/en/publications/statistical-yearbook/statistics-of-lodz-2020,1,16.html.
  55. Suarez P., Sappa A., Vintimilla B., 2017. Learning image vegetation index through a conditional generative adversarial network. In: 2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM), 1–6. DOI 10.1109/ETCM.2017.8247538.
  56. Suárez P., Sappa A., Vintimilla B., Hammoud R., 2019. Image vegetation index through a cycle generative adversarial network. In: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 1014–1021. DOI 10.1109/CVPRW.2019.00133.
  57. Sultana S., Ali A., Ahmad A., Mubeen M., Zia-Ul-Haq M., Ahmad S., Ercisli S., Jaafar H., 2014. Normalized difference vegetation index as a tool for wheat yield estimation: A case study from Faisalabad, Pakistan. The Scientific World Journal 2014: e725326. DOI 10.1155/2014/725326.
  58. TensorFlow, (2018) 2022. TensorFlow documentation (Jupyter notebook). Online: https://github.com/tensorflow/docs/blob/d58904052034c0870678709dc1ee8eb35e2fd34c/site/en/tutorials/generative/pix2pix.ipynb.
  59. TensorFlow Datasets, n.d. Online: https://www.tensorflow.org/datasets (accessed 27 January 2022).
  60. Tomaszewska M., Lewiński S., Woźniak E., 2011. Use of MODIS satellite images to study the percentage of vegetation cover. Teledetekcja Środowiska 46: 15–22.
  61. Tucker C., 1979. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sensing of Environment 8(2): 127–150. DOI 10.1016/0034-4257(79)90013-0.
  62. Turlej K., 2009. Comparison of NDVI index based on NOAA AVHRR, SPOT-VEGETATION and TERRA MODIS satellite data. Teledetekcja Środowiska 42: 83–88.
  63. Tuszynska J., Gatkowska M., Wrobel K., Jagiello K., 2018. A pilot study on determining approximate date of crop harvest on the basis of Sentinel-2 satellite imagery. Geoinformation Issues 10(1): 65–77. Online: http://yadda.icm.edu.pl/yadda/element/bwmeta1.element.baztech-46991614-3b5b-429e-892e-b1a2556684c5.
  64. van der Walt S., Schönberger J.L., Nunez-Iglesias J., Boulogne F., Warner J.D., Yager N., Gouillart E., Yu T., 2014. Scikit-image: Image processing in Python. PeerJ 2: e453. DOI 10.7717/peerj.453.
  65. Verykokou S., Ioannidis C., 2019. A global photogrammetry-based structure from motion framework: Application in oblique aerial images. In: FIG Working Week 2019: Geospatial information for a smarter life and environmental resilience, Hanoi, Vietnam.
  66. Wang Z., Bovik A., Sheikh H., Simoncelli E., 2004. Image quality assessment: From error visibility to structural similarity. IEEE Transactions on Image Processing 13(4): 600–612. DOI 10.1109/TIP.2003.819861.
  67. Worm A., Będkowski K., Bielecki A., 2019. The use of surface and volume indicators from high resolution remote sensing data to assess the vegetation filling of urban quarters in Łódź City Centre, Poland. Teledetekcja Środowiska 60. Online: http://yadda.icm.edu.pl/baztech/element/bwmeta1.element.baztech-4a024b76-0072-48be-94a6-ceea9e001322.
  68. Yao G., Yilmaz A., Zhang L., Meng F., Ai H., Jin F., 2021. Matching large baseline oblique stereo images using an end-to-end convolutional neural network. Remote Sensing 13(2): 274. DOI 10.3390/rs13020274.
  69. Zhang Y., 2001. Texture-integrated classification of urban treed areas in high-resolution color-infrared imagery. Photogrammetric Engineering & Remote Sensing 67(12): 1359–1365.
  70. Zhou S., Gordon M., Krishna R., Narcomey A., Fei-Fei L., Bernstein M., 2019. HYPE: a benchmark for human eYe perceptual evaluation of generative models. arXiv:1904.01121 [cs]. Online: http://arxiv.org/abs/1904.01121.
  71. Zięba-Kulawik K., Hawryło P., Wężyk P., Matczak P., Przewoźna P., Inglot A., Mączka K., 2021. Improving methods to calculate the loss of ecosystem services provided by urban trees using LiDAR and aerial orthophotos. Urban Forestry & Urban Greening 63 (August): 127195. DOI 10.1016/j.ufug.2021.127195.
  72. Zięba-Kulawik K., Wężyk P., 2022. Monitoring 3D changes in urban forests using landscape metrics analyses based on multi-temporal remote sensing data. Land 11(6): 883. DOI 10.3390/land11060883.
DOI: https://doi.org/10.14746/quageo-2023-0007 | Journal eISSN: 2081-6383 | Journal ISSN: 2082-2103
Language: English
Page range: 87–106
Submitted on: Sep 27, 2022
Published on: Jan 29, 2023
Published by: Adam Mickiewicz University
In partnership with: Paradigm Publishing Services
Publication frequency: 4 times per year

© 2023 Maciej Adamiak, Krzysztof Będkowski, Adam Bielecki, published by Adam Mickiewicz University
This work is licensed under the Creative Commons Attribution 4.0 License.