A. Job Profile Template
Job Purpose
Our railway infrastructure, including tracks, tunnels, and viaducts, undergoes routine inspection and monitoring to ensure the highest standards of safety and reliability. We leverage vast amounts of data from laser measurement systems, photo imaging, IoT sensors, and maintenance records to identify defect trends, drive proactive maintenance, and continuously improve our network.
This role is central to our data-driven maintenance strategy, focusing on Data Engineering and Power BI to deliver critical insights and decision support for our core operations. You will take ownership of existing data solutions, enhance their reliability and performance, and build new data pipelines and dashboards that empower engineers to act faster and with greater confidence.
Your work will directly influence preventive, condition-based, and predictive maintenance regimes. Furthermore, you will design simulations and what-if analyses to optimize spares provisioning and workforce planning, translating operational feedback into tangible improvements.
Responsibilities
As a Senior Engineer, you will assist the Section Manager and be responsible for the following:
Data Engineering & Pipeline Management
- Design, implement, and operate production-grade data pipelines on the Azure platform (ADF, Synapse/Fabric, ADLS, Functions).
- Develop robust ETL/ELT scripts in Python and SQL, ensuring strong data quality controls, schema validation, and SLA monitoring (a minimal sketch of such a quality-gated step follows this list).
- Take full ownership of existing data solutions (including CDE integrations, SharePoint/REST ingestions, and scheduled jobs), maintaining comprehensive documentation, runbooks, and recovery procedures.
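To make the data quality and schema validation expectations concrete, the sketch below shows a minimal, quality-gated ingestion step in Python. It is illustrative only: the columns, dtypes, and file path are assumptions, not part of the actual pipelines.

```python
# Illustrative only: the columns, dtypes, and path below are hypothetical.
import pandas as pd

EXPECTED_SCHEMA = {
    "asset_id": "int64",
    "measured_at": "datetime64[ns]",
    "defect_score": "float64",
}

def validate_schema(df: pd.DataFrame) -> None:
    """Fail fast if incoming columns or dtypes drift from the agreed contract."""
    missing = set(EXPECTED_SCHEMA) - set(df.columns)
    if missing:
        raise ValueError(f"Missing columns: {sorted(missing)}")
    for col, dtype in EXPECTED_SCHEMA.items():
        if str(df[col].dtype) != dtype:
            raise TypeError(f"Column {col!r} is {df[col].dtype}, expected {dtype}")

def run_quality_checks(df: pd.DataFrame) -> None:
    """Basic gates: no null keys, no duplicate readings per asset and timestamp."""
    if df["asset_id"].isna().any():
        raise ValueError("Null asset_id values found")
    if df.duplicated(subset=["asset_id", "measured_at"]).any():
        raise ValueError("Duplicate measurements for the same asset and timestamp")

def load_measurements(path: str) -> pd.DataFrame:
    """Read a landed file (e.g. from ADLS), validating before loading downstream."""
    df = pd.read_parquet(path)
    df["measured_at"] = pd.to_datetime(df["measured_at"])
    validate_schema(df)
    run_quality_checks(df)
    return df
```

In production, equivalent checks would typically be wired into ADF/Synapse activities, with failures surfaced to the SLA monitoring described above.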
Business Intelligence & Analytics
- Enhance and manage Power BI data models and dashboards, providing timely and accurate information to engineering and operational teams.
- Lead advanced Power BI development, including writing complex DAX measures, optimizing Power Query (M), implementing RLS/OLS, and managing incremental refresh and data gateways.
- Establish rigorous monitoring, alerting, and capacity planning for all datasets and pipelines to achieve ≥99% on-time refresh for Tier-1 reports (see the monitoring sketch after this list).
- Collaborate with Railway Infrastructure Engineers to support fault diagnostics and mine reliability trends to inform maintenance programs.
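As a hedged illustration of the refresh monitoring called out above, the sketch below pulls a dataset's refresh history from the Power BI REST API and computes a simple success rate as a proxy for the on-time target. The workspace and dataset IDs are placeholders, token acquisition is omitted, and field names should be confirmed against the current API reference.

```python
# Illustrative sketch: AAD token acquisition is omitted and the IDs are placeholders.
import requests

API = "https://api.powerbi.com/v1.0/myorg"

def refresh_history(group_id: str, dataset_id: str, token: str, top: int = 50) -> list[dict]:
    """Fetch the most recent refresh attempts for a dataset in a workspace."""
    url = f"{API}/groups/{group_id}/datasets/{dataset_id}/refreshes?$top={top}"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("value", [])

def success_rate(refreshes: list[dict]) -> float:
    """Share of refresh attempts reported as Completed."""
    if not refreshes:
        return 0.0
    completed = sum(1 for r in refreshes if r.get("status") == "Completed")
    return completed / len(refreshes)

# Hypothetical usage: alert when the rate drops below the 99% target.
# rate = success_rate(refresh_history("<workspace-id>", "<dataset-id>", token))
# if rate < 0.99:
#     notify_on_call(rate)  # notify_on_call is a placeholder, not a real helper
```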
Operational Integration & Project Management
- Translate operational needs into actionable dashboards, alerts, and self-service analytics tools.
- Work closely with operations teams to understand their challenges, with future opportunities for cross-training and shadowing to gain deep domain expertise.
- Manage projects end-to-end, including requirements gathering, design, build, testing, and handover, with clear timelines and stakeholder communication.
Team & Technology Leadership
- Collaborate with cross-functional teams to align initiatives with sustainability and compliance goals, and research emerging maintenance and green technologies.
- Support the implementation of sustainability projects by developing data models and dashboards to monitor performance metrics.
- Coach and mentor interns and junior engineers, leading code reviews and helping to improve internal standards and developer ergonomics.
Qualifications & Work Experience
- A degree in Engineering (Mechanical, Electrical, Civil), Computer Science, Information Systems, or equivalent professional experience.
- 3–6 years of hands-on experience in data engineering or analytics engineering, with proven ownership of production pipelines and BI models.
- Prior experience in maintenance, rail, asset-intensive, or other industrial operations is strongly preferred.
Skills
Technical Proficiency
- SQL: Advanced proficiency, including performance tuning, partitioning, indexing, and complex queries.
- Data Modelling: Strong understanding of dimensional modelling concepts such as star schemas and slowly changing dimensions (SCDs); a Type 2 SCD update is illustrated in the sketch after this list.
- Python: Proficient in Python for data engineering, including libraries such as pandas and PySpark, as well as data quality frameworks.
- Power BI: Expert-level skills in DAX, Power Query (M), composite models, incremental refresh, RLS/OLS, deployment pipelines, and gateway administration.
- Azure Cloud: Comfortable working with Azure data services, including Azure Data Factory (ADF), Synapse/Fabric (SQL Warehouse or Lakehouse), ADLS, Key Vault, and Azure Functions/containers.
- DevOps: Experience with version control (Git) and CI/CD principles (e.g., YAML pipelines).
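As one way to picture the slowly changing dimension handling referenced in the Data Modelling bullet, here is a small, assumption-laden pandas sketch of a Type 2 update. The dimension layout (asset_id, asset_type, validity columns) is hypothetical, and a warehouse implementation would normally express the same logic in SQL (e.g. a MERGE) rather than pandas.

```python
# Illustrative Type 2 SCD update in pandas; table and column names are hypothetical.
import pandas as pd

HIGH_DATE = pd.Timestamp("9999-12-31")

def apply_scd2(dim: pd.DataFrame, changes: pd.DataFrame, as_of: pd.Timestamp) -> pd.DataFrame:
    """Expire current rows whose attributes changed and append new versions.

    dim:     existing dimension [asset_id, asset_type, valid_from, valid_to, is_current]
    changes: incoming snapshot [asset_id, asset_type]
    Brand-new asset_ids are ignored here for brevity.
    """
    current = dim[dim["is_current"]]
    merged = current.merge(changes, on="asset_id", suffixes=("", "_new"))
    changed_ids = merged.loc[merged["asset_type"] != merged["asset_type_new"], "asset_id"]

    # Close out the old versions of changed records.
    expire_mask = dim["is_current"] & dim["asset_id"].isin(changed_ids)
    dim.loc[expire_mask, "valid_to"] = as_of
    dim.loc[expire_mask, "is_current"] = False

    # Insert the new versions, effective from `as_of`.
    new_rows = changes[changes["asset_id"].isin(changed_ids)].copy()
    new_rows["valid_from"] = as_of
    new_rows["valid_to"] = HIGH_DATE
    new_rows["is_current"] = True
    return pd.concat([dim, new_rows], ignore_index=True)
```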
Professional Competencies
- Excellent documentation and communication skills, with the ability to create clear diagrams, runbooks, and concise status updates.
- Proficiency in Microsoft Excel for ad-hoc analysis and stakeholder reporting.
Certifications (Bonus)
- Microsoft Certified: Azure Data Engineer Associate
- Microsoft Certified: Power BI Data Analyst Associate
- Microsoft Certified: Azure Data Scientist Associate