Job Description
We are looking for a DataStage developer to analyze DataStage jobs, identify datasets and mappings, and produce a solution design for migrating them to Azure.
In this role, you will create and analyze ingestion frameworks on the DataStage platform to deliver data discovery artifacts (for example, a Business Requirements Document (BRD) and Source-to-Target Mapping).
Responsibilities:
- Reviewing and discussing briefs with key personnel assigned to projects.
- Analyzing DataStage jobs by working through their ETL components and producing source-to-target mappings.
- Performing gap analysis between sources and targets.
- Updating documentation.
- Monitoring jobs and identifying bottlenecks in the data processing pipeline.
- Analyzing Teradata SQL and BTEQ scripts used in DataStage jobs.
- Analyzing Tivoli scheduler jobs that invoke DataStage ETL mappings.
- Assisting project leaders in determining project timelines and objectives.
- Testing and troubleshooting problems in ETL system designs and processes.
- Improving existing ETL approaches and solutions used by the company.
- Providing support to customers on issues relating to the storage, handling, and access of data.
Requirements:
- Level of experience: 3-8 years.
- Bachelor's degree in computer science, information systems, or a similar field.
- Demonstrable experience as a DataStage developer.
- IBM DataStage certification or a similar qualification.
- Proficiency in SQL.
- Experience with, or an understanding of, other ETL tools such as Informatica or Oracle ETL.
- Knowledge of data modeling, database design, and the data warehousing ecosystem.
- Skilled at the ideation, design, and deployment of DataStage solutions.
- Excellent analytical and problem-solving skills.
- Excellent communication skills.
- Experience in analyzing Teradata BTEQ scripts.
- Exposure to Azure Cloud.