Job Description:
You will assist in various projects to support the growth of the US&CA data asset portfolio. You will support different team experts in the development of new tools and/or the maintenance of current proprietary databases, primarily through structuring and managing databases, developing data workflows and processes, and building visualizations and dashboards.
One of your core responsibilities will be building data assets using R, Python, and BI tools such as Tableau, Power BI, and Domo to generate new insights from data.
You may be asked to facilitate the ingestion of large datasets from online and offline sources, and to track and analyze the raw usage metrics of our asset portfolio. You will mine this data to help us better understand user behaviors, proposing changes ("interventions") to improve product adoption and satisfaction.
Candidate Requirements:
* Final year of a degree in data engineering, computer science, information technology, data science, or data analytics
* Strong foundational knowledge of databases, data structures, and SQL
* Familiarity with programming languages such as R, Python, Java, or Scala
* Basic understanding of data warehousing concepts and ETL processes
* Good problem-solving skills and attention to detail
* Excellent communication and teamwork abilities
* A willingness to learn and a proactive attitude toward professional development
* Experience with cloud platforms like Snowflake, AWS, Azure, or GCP is a plus
* Exposure to big data technologies such as Hadoop, Spark, or Kafka is a plus
Source: Company website
Posted on: 19 Oct 2024
Type of job: Internship
Industry: Consulting
Languages: English