Work With Us

The field of Data Science is here to stay and grow. Are you an experienced data professional looking to work on enterprise-scale projects, or are you looking to start your career in this exciting industry? Datalens is the perfect place for you.

JOB DESCRIPTION:

Employment Type: Full Time, Permanent
Location: Hyderabad, Singapore
No. of Positions: 1
Joining: Immediate

Key Responsibilities & Duties:
  • Work as part of a team to develop Cloud Data and Analytics solutions.
  • Participate in the development of cloud data warehouses, data-as-a-service, and business intelligence solutions.
  • Provide forward-thinking solutions in data and analytics.
  • Develop Modern Data Warehouse solutions.
  • Design, develop, and build data integration pipeline architecture, and ensure successful creation of the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Spark, SQL, and other technologies.
  • Write complex, highly optimized queries across large datasets.
  • Handle code deployments and code reviews.
  • Develop and maintain design documentation, test cases, and performance monitoring and evaluation using Git, Confluence, and Cluster Manager.
Key Skills & Requirements:
  • 4–5 years of experience in diversified IT roles, including a minimum of 3 years of relevant experience with Databricks.
  • Hands-on experience in Python/PySpark/Scala/Hive programming to develop ETL pipelines.
  • Working experience in ETL testing, including Master Data Management, Data Completeness, Data Cataloging, Data Lineage, and Data Quality for various data feeds coming from the source.
  • Hands-on experience in creating stored procedures, functions, tables, cursors, and hash tables, and in handling large data types.
  • Experience in database testing, data comparison, and data transformation scripting.
  • Good knowledge of Data Warehousing concepts and Data Modelling.
  • Experience working on complex data architecture.
  • Ability to communicate progress and other relevant information to project stakeholders.
  • Proficiency with a source code control system such as Git.
  • Experience building CI/CD pipelines in data environments.
  • Excellent communication and documentation skills.
  • Prior experience working in the onsite/offshore model.

Add-ons: Any AWS/Azure and Databricks certifications

Kindly share your CVs at talent.datalensai.com

JOB DESCRIPTION:

Employment Type: Full Time, Permanent
Location: Hyderabad, Bangalore, Chennai (remote can be considered based on the requirement)
No. of Positions: 1
Joining: Immediate

Key Responsibilities & Duties:
  • Work as part of a team to develop Cloud Data and Analytics solutions.
  • Participate in the development of cloud data warehouses, data-as-a-service, and business intelligence solutions.
  • Provide forward-thinking solutions in data and analytics.
  • Develop Modern Data Warehouse solutions.
  • Design, develop, and build data integration pipeline architecture, and ensure successful creation of the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Spark, SQL, and other technologies.
  • Write complex, highly optimized queries across large datasets.
  • Handle code deployments and code reviews.
  • Develop and maintain design documentation, test cases, and performance monitoring and evaluation using Git, Confluence, and Cluster Manager.
Key Skills & Requirements:
  • 4–5 years of experience in diversified IT roles, including a minimum of 3 years of relevant experience with Databricks.
  • Certification in Azure Databricks preferred.
  • Minimum of 2 years of hands-on experience in data migration.
  • Minimum of 2 years of experience working with Azure Databricks.
  • Minimum of 2 years of proficiency in MS SQL, Python, and PySpark.
  • Nice to have: Previous experience in the insurance or finance domain.

Add-ons: Any AWS/Azure and Databricks certifications

Kindly share your CVs at talent.datalensai.com

JOB DESCRIPTION:

Employment Type: Full Time, Permanent
Location: Hyderabad, Bangalore, Chennai (remote can be considered based on the requirement)
No. of Positions: 1
Joining: Immediate

Key Responsibilities & Duties:
  • Recognize business requirements in the context of BI and create data models to transform raw data into relevant insights.
  • Using Power BI, create dashboards and interactive visual reports.
  • Define key performance indicators (KPIs) with specific objectives and track them regularly.
  • Analyze data and display it in reports to aid decision-making.
  • Convert business needs into technical specifications and establish a timetable for job completion.
  • Create, test, and deploy Power BI scripts, and perform efficient deep analysis.
  • Use Power BI to run DAX queries and functions.
  • Create charts and data documentation with explanations of algorithms, parameters, models, and relationships.
  • Construct a data warehouse.
  • Use optimized SQL queries to achieve the best results.
  • Make technological adjustments to current BI systems to improve their performance.
  • For a better understanding of the data, use filters and visualizations.
  • Analyze current ETL procedures to define and create new systems.
Key Skills & Requirements:
  • Background with BI tools and systems such as Power BI and Tableau.
  • Prior experience in data-related tasks.
  • Familiarity with MS SQL Server BI Stack tools and technologies, such as SSRS, T-SQL, Power Query, MDX, Power BI, DAX, Azure Data Factory, and Azure Data Lake.
  • Understanding of the Microsoft BI Stack.
  • Mastery of data analytics.
  • Proficiency in software development.
  • Analytical thinking for converting data into relevant reports and graphics.
  • Experience working on complex data architecture.
  • Ability to enable row-level data security.
  • Knowledge of Power BI application security layer models.
  • Ability to run DAX queries on Power BI Desktop.
  • Proficiency in performing advanced-level computations on the data set.
  • Excellent communication skills are required to convey needs to clients and internal teams successfully.

Kindly share your CVs at talent.datalensai.com

Ready to Discuss Your Next Project?