Hardship Level (not applicable for home-based): H (no hardship)

Family Type (not applicable for home-based): Family

Staff Member / Affiliate Type: UNOPS LICA9

Target Start Date: 2023-06-01

Job Posting End Date: June 11, 2023

Terms of Reference

Title: Associate Cloud Data Engineer
Location: Home-based
Section/Unit: PSP/Supporter Engagement Section
ICA Level: UNOPS LICA-9 (equivalent to NO-B level)
Duration: from 01/07/2023 to 31/12/2023 (possibility of extension)

1. General Background

This position is with the Data and Analytics function in UNHCR's Supporter Engagement Section (SES) within the Private Sector Partnerships Service (private fundraising) and serves departments within the Division of External Relations (DER), PSP and the wider UNHCR organization.

The Data and Analytics function supports UNHCR fundraising activities and works in the field of digital marketing and CRM tools. We operate systems that collect marketing data and store it in Google Cloud BigQuery, and we ingest and orchestrate new data with tools such as Cloud Data Fusion, Stitch, Skyvia and Apache Airflow.
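For illustration only, the orchestration described above could resemble the minimal Airflow sketch below, which loads a daily marketing export from Cloud Storage into BigQuery. This is not UNHCR's actual pipeline: the DAG, bucket, dataset and table names are hypothetical placeholders, and the operator is assumed to come from the apache-airflow-providers-google package.

```python
# Minimal sketch only; all names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="load_marketing_export",      # hypothetical pipeline name
    start_date=datetime(2023, 7, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    GCSToBigQueryOperator(
        task_id="gcs_to_bigquery",
        bucket="example-marketing-exports",           # placeholder bucket
        source_objects=["ads/{{ ds }}/export.csv"],   # one export per day
        destination_project_dataset_table="example_project.raw.ads_daily",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",             # append each day's rows
    )
```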

As Associate Cloud Data Engineer you will work with key stakeholders at UNHCR HQ and in the regional offices to advise on, develop and implement a best-practice analytics and marketing data warehouse in Google Cloud. You will be UNHCR's all-round expert in Google Cloud Platform products.

Specific attention will be given to ensuring continued value from the UNHCR Data Lake for fundraising data in GCP BigQuery.

This will include:
• Continued onboarding of more regional teams
• Adding more data sources and pipelines
• Designing and implementing data marts for data analysis and reporting officers.
• Introductory training in BigQuery
• Management and monitoring of pipelines
• Governance: managing user access and writing user guidelines
• Data management
• Working with UNHCR ICT security team

2. Purpose and Scope of Assignment

Under the overall supervision of the PSP Data and Analytics Officer, the individual contractor will assist in the following:

• Designing and implementing data models and data pipelines for digital analytics, digital ads optimization, donations and CRM engagement
• Optimizing and operating a commercial data warehouse based on Google Cloud BigQuery (a managed, serverless data warehouse) and related GCP products
• Defining solutions, requirements and processes to make data universally available to the analyst and reporting teams who deliver dashboards in Microsoft Power BI
• Designing and operating data models for data in raw, curated and normalized form to meet the needs of data consumers (data analysis and reporting officers); a minimal sketch of such a curated view follows this list
• Assisting other teams and stakeholders in making the best use of the data to speed up their operations
• Keeping an eye out for source-system API changes and identifying opportunities by staying up to date with these vendors' release notes
• Monitoring the data pipelines and identifying opportunities for refactoring and improvement
• Coordinating a global network of internal clients across fundraising, advocacy and other programs, ensuring standardized data conventions
• Working closely with the different teams to derive value from data and helping them create corresponding tasks from data and insights
• Helping to promote a data-first strategy and identifying business opportunities from analyzing data
• Coordinating UNHCR's community of BigQuery data warehouse users
• Advocating for the generally available dashboards in Power BI and working closely with the team that creates and manages these dashboards
• Data management: documenting data models and maintaining data dictionaries
• Training colleagues in BigQuery and in understanding the data models
• Working with the UNHCR ICT security team to ensure security standards are met
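As a rough illustration of the curated layer mentioned above, the sketch below uses the google-cloud-bigquery Python client to publish a simple data-mart view for reporting. The project, dataset, table and column names are hypothetical placeholders, not the actual UNHCR data model.

```python
# Minimal sketch, assuming the google-cloud-bigquery client library;
# all project, dataset, table and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

# Curated view joining raw donation rows to a CRM dimension so reporting
# officers can query one flat table instead of the raw layer.
sql = """
CREATE OR REPLACE VIEW `example-project.curated.donations_by_channel` AS
SELECT
  d.donation_id,
  d.amount,
  d.currency,
  d.donated_at,
  c.marketing_channel,
  c.region
FROM `example-project.raw.donations` AS d
LEFT JOIN `example-project.raw.crm_contacts` AS c
  ON d.contact_id = c.contact_id
"""

client.query(sql).result()  # run the DDL and wait for completion
```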

3. Qualifications and experience

a. Education

• University degree in Computer Science, Engineering, Mathematics, Statistics, Business Administration or Marketing, or a related field - computer science and digital marketing qualifications preferred.
• Google Cloud Data Engineer Certification is preferred, but not required.
• A completed training course in Google Cloud BigQuery or another NoSQL database is preferred, but not required.
• A completed SQL training course is preferred.

b. Work Experience

• 3 years of relevant experience with a bachelor’s degree, or 2 years with a master’s or equivalent or higher degree
• Experience designing, building and maintaining data warehousing solutions using GCP technologies such as BigQuery, Cloud Storage, Cloud Data Fusion, Apache Airflow, Cloud Composer and Pub/Sub, or equivalent AWS or Azure products
• Experience with SQL
• Python development experience with API and ETL integration tasks
• Experience with ETL processes and tools (e.g. Cloud Data Fusion, Stitch, Skyvia or similar tools)
• Experience in setting up data pipelines and automating data workflows
• Familiarity with digital marketing datasets such as Google Analytics, Google Ads and Meta Ads
• Familiarity with CRM systems such as Salesforce, including reverse ETL into Salesforce, marketing automation tools and ad platforms to make data actionable
• Familiarity with cloud security best practices, including access control and monitoring, and with data privacy regulations such as GDPR and CCPA
• Experience with data visualization tools such as Microsoft Power BI or Looker is desirable
• Experience working with GitHub or a similar tool

c. Key Competencies

• Strong understanding of data modeling, ETL processes, and data warehousing concepts.
• A structured and analytical mindset for explaining complex data models to other teams
• Staying up to date with changes in marketing data sources and with potential new data sources (Salesforce Sales Cloud, Google Analytics, Google Ads, Meta Ads)
• Project management skills
• Good communication skills, with the ability to explain technical topics clearly
• Strong interpersonal skills and ability to establish and maintain effective working relationships with people in a multicultural, multi-ethnic environment with sensitivity and respect for diversity
• Fluency in spoken and written English is required
• Knowledge of another UN language is a plus (Arabic, Chinese, French, Russian, Spanish).

4. Location and Conditions

This is a remote working position and the successful candidate will be home-based.

This position initially runs until the end of the year, with the possibility of extension subject to budget availability and the overall performance of the incumbent. It is a full-time role, from 8.30 am to 5 pm Monday to Friday (40 hours per week).

The remuneration level and the applicable entitlements and benefits may differ depending on the residence of the most suitable selected candidate.

Recruitment as UNHCR staff and engagement under a UNHCR affiliate scheme or as an intern is subject to proof of vaccination against COVID-19.

Additional Qualifications

Education: BA (Required)

Other information: This position doesn't require a functional clearance
