Geospatial Big Data Engineer
The big data engineer will participate in building large-scale data processing systems, is an expert in data warehousing solutions, and is able to work with the latest (NoSQL) database technologies.
A big data engineer should embrace the challenge of dealing with petabytes or even exabytes of data daily, understands how to apply technologies to solve big data problems, and develops innovative big data solutions. The big data engineer generally works on implementing complex big data projects with a focus on collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into insights across multiple platforms, and should be able to develop prototypes and proofs of concept for the selected solutions. This role will drive the engineering and building of geospatial data assets.
Key responsibilities include:
* Design, build, and support cloud and open-source systems that process geospatial data assets via an API-based platform
* Up-to-date knowledge of public-domain data sets relevant to the product pipeline
* Partners with data science and commercial communities to bring needed data sets into GIS and big data analytical environments
* Integration of key environmental data into field management systems; provides leadership in advancing understanding of environmental influences on field performance and risk factors
* Minimum of 3 years' experience with Geographic Information Systems (GIS) such as OpenGeo Suite, Google Maps, MapBox, or CartoDB
* Minimum of 3 years' experience with Java, Scala, Python, or a similar development language
* Extensive knowledge of multiple programming or scripting languages such as Java, C++, PHP, Ruby, Python, and/or R, along with Linux shell scripting
* Experience working with raster and vector data sets using GDAL and similar spatial libraries
* Experience developing REST-style and OGC APIs that serve geospatial data using GeoServer or other similar open-source technologies, preferably in a cloud environment
* Experience consuming GeoServer-enabled services via JavaScript libraries such as Google Maps and OpenLayers
* Proven experience (2 years) with distributed systems, e.g. Mesos, Kubernetes, Spark, Hadoop, Cassandra, distributed databases, grid computing
* Ability to build and maintain modern cloud architecture, e.g. AWS, Google Cloud, etc.
* Experience working with PostgreSQL/PostGIS to process both vector and raster data formats such as shapefile (.shp), GeoJSON, GeoHash, GeoTIFF, NetCDF, PNG, and JPG
* Experience with code versioning and dependency management systems such as GitHub, SVN, and Maven
* Experience with stream processing, e.g. Kafka
* Demonstrated knowledge of agriculture and/or agriculture-oriented businesses
* Experience implementing complex big data projects with a focus on collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into insights across multiple platforms
* Demonstrated experience adapting to new technologies
* Capable of deciding on hardware and software design needs and acting on those decisions; able to develop prototypes and proofs of concept for the selected solutions
* Experience with object-oriented design, coding, and testing patterns, as well as with engineering (commercial or open-source) software platforms and large-scale data infrastructures
* Experience creating cloud computing solutions and web applications that leverage public and private APIs
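As a small illustration of the geospatial data handling this role involves, the sketch below encodes a coordinate as a GeoHash and wraps it in a minimal GeoJSON Point feature using only the Python standard library. The function names (`geohash_encode`, `point_feature`) are illustrative assumptions for this example, not part of any system referenced in this posting.

```python
import json

# Standard GeoHash base-32 alphabet (omits a, i, l, o).
_BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"


def geohash_encode(lat: float, lon: float, precision: int = 9) -> str:
    """Encode a latitude/longitude pair as a GeoHash string.

    Bits alternate between longitude and latitude, halving the
    bounding interval at each step; every 5 bits map to one
    base-32 character.
    """
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    bits, even = [], True  # a GeoHash starts with a longitude bit
    while len(bits) < precision * 5:
        if even:
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                bits.append(1)
                lon_lo = mid
            else:
                bits.append(0)
                lon_hi = mid
        else:
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits.append(1)
                lat_lo = mid
            else:
                bits.append(0)
                lat_hi = mid
        even = not even
    return "".join(
        _BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, len(bits), 5)
    )


def point_feature(lat: float, lon: float, precision: int = 9) -> dict:
    """Build a minimal GeoJSON Point feature tagged with its GeoHash.

    Note: GeoJSON orders coordinates as [longitude, latitude].
    """
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {"geohash": geohash_encode(lat, lon, precision)},
    }


# Classic GeoHash reference coordinate (near Hals, Denmark).
feature = point_feature(57.64911, 10.40744, precision=11)
print(json.dumps(feature))  # → geohash "u4pruydqqvj"
```

In a production pipeline this kind of feature construction would typically be delegated to PostGIS or GDAL rather than hand-rolled; the pure-Python version here just makes the encoding visible end to end.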