Project focus: data extraction from BIM models (construction-related software, primarily Revit) and consolidation into data lakes/data warehouses -> an ETL process whose output is consumed by other services/products.
Basic Requirements:
1. Bachelor’s Degree in Computer Science, Data Science, Information Technology, or a related field.
2. Proficiency in programming languages such as Python, Java, or Scala, and data processing frameworks like Apache Spark or Hadoop.
3. Experience with database systems (SQL and NoSQL) and data warehousing solutions (e.g., AWS Redshift, Google BigQuery).
Responsibilities:
1. Building Data Pipelines: Design, develop, and maintain scalable data pipelines for extracting, transforming, and loading data.
2. Data Integration: Integrate data from various sources into a centralized data warehouse or data lake.
3. Data Quality and Governance: Implement data quality checks and ensure compliance with data governance policies.
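The responsibilities above describe a classic extract-transform-load flow. A minimal sketch in Python, purely illustrative: the record fields are hypothetical (not a real Revit schema), and SQLite stands in for a warehouse such as Redshift or BigQuery.

```python
import sqlite3

# Hypothetical sample of records exported from a BIM model (e.g. a Revit
# schedule dump); field names are illustrative, not a real Revit schema.
RAW_ELEMENTS = [
    {"element_id": "101", "category": "Walls",   "area_m2": "24.5"},
    {"element_id": "102", "category": "Doors",   "area_m2": "1.8"},
    {"element_id": "102", "category": "Doors",   "area_m2": "1.8"},  # duplicate
    {"element_id": "103", "category": "Windows", "area_m2": None},   # missing value
]

def extract(raw):
    """Extract step: yield raw records (stands in for reading a BIM export)."""
    yield from raw

def transform(records):
    """Transform step: deduplicate on element_id and apply a simple
    data-quality check (area must be present and positive)."""
    seen = set()
    for rec in records:
        if rec["element_id"] in seen:
            continue
        if rec["area_m2"] is None or float(rec["area_m2"]) <= 0:
            continue
        seen.add(rec["element_id"])
        yield (rec["element_id"], rec["category"], float(rec["area_m2"]))

def load(rows, conn):
    """Load step: write cleaned rows into a warehouse table
    (in-memory SQLite stands in for Redshift/BigQuery)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS bim_elements "
        "(element_id TEXT PRIMARY KEY, category TEXT, area_m2 REAL)"
    )
    conn.executemany("INSERT INTO bim_elements VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_ELEMENTS)), conn)
loaded = conn.execute(
    "SELECT element_id, category, area_m2 FROM bim_elements ORDER BY element_id"
).fetchall()
```

In production the extract step would call the Revit API or parse an IFC export, and the load step would batch-write to the warehouse; the pipeline shape (extract -> quality-checked transform -> load) stays the same.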
More context: the project is at an early stage; the client will conduct a workshop in October to finalize details with the end client (based in Singapore).
The approved candidate will need to be comfortable acting as an on-demand consultant for the month of October and transitioning to a full-time position in November.
Candidates can be located in LATAM or Europe.