A world-leading energy company has an exciting contract opportunity for an Azure Database Administrator, ideally with experience in commodity trading environments, to join one of the largest energy trading operations in the world. The operation has key hubs in Singapore, Dubai, Houston, London, and Rotterdam, trading in crude oil, natural gas, LNG, electrical power, refined products, chemical feedstocks and environmental products.
This position supports the IT delivery team in maintaining a large, complex Azure-based data platform and assists teams with day-to-day data performance challenges.
About the role – as an expert in Azure data storage, you will:
• Ensure the ongoing performance and reliability of several large data warehouses and Azure data storage solutions, including Synapse, SQL, Cosmos DB, Redis, Blob Storage, ADLS and other storage services
• Apply a deep understanding of the storage options in Azure to provide detailed performance and configuration optimisation, including the impact on cost
• Support multiple teams with complex problems that others have been unable to solve – many requests will come from teams where something has gone wrong, and you will need to fix the issue and explain how and why the fix works
• Drive to delivery – decompose work into a set of concrete steps, and manage delivery against those steps, ensuring progress towards the end goal
About you / Essential experience
• Azure Storage – Excellent hands-on practical experience of Microsoft Azure storage options including Synapse, SQL, Cosmos DB, Blob Storage, ADLS, Redis, ELK, queues, tables etc., developed in complex, real-world projects, including evaluating cost/performance trade-offs
• Highly proficient DBA – optimisation of indexes, table partitioning, queries and execution plans, permissions, performance, encryption, replication etc., including analysing database usage (e.g. which tables were accessed) and proactive monitoring
• Data warehousing – you’ll know your star from your snowflake, when to use each, and when to use neither
• Time series data – we generate and consume massive amounts of time series data. You’ll have hands-on experience working with it at scale while maintaining performance.
• Strong Transact-SQL skills – you’ll know when and when not to use linked servers, cursors, and common table expressions, and how to document a stored procedure
• CI/CD – hands-on experience with automated deployment using GitHub Actions and Azure DevOps, and configuration management of both infrastructure (e.g. ARM, Terraform) and database & schema (e.g. DACPAC, BACPAC)
• Data Engineering – understanding of data engineering tooling such as Databricks/Spark, ADF, Apache Airflow, and associated consumers and generators such as Azure Functions, Web Apps, and distributed/event-driven architectures (Event Hubs, event bus, Kafka, ELK etc.)
• Agile working – Practical experience of working effectively in an iterative/Agile development team (including sprint planning, daily scrums, reviews and retrospectives)
• Experience in a commodity trading organisation, with an understanding of the types of data typically used in our operations
• Experience in toolsets for managing requirements, for example, Azure DevOps, SharePoint, etc.
• Professional accreditation, e.g. Foundation in Business Analysis, ISEB Diploma in Business Analysis, or the Professional Scrum Master I (fundamental) assessment from scrum.org
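To give candidates a feel for the DBA bullet above (index optimisation and execution plans), here is a minimal, hypothetical sketch using Python's built-in sqlite3 module as a stand-in for a real SQL Server instance; the table and index names are invented for illustration, but the trade-off shown — full scan versus index seek — is the same one weighed in production:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, product TEXT, volume REAL)")
conn.executemany(
    "INSERT INTO trades (product, volume) VALUES (?, ?)",
    [("crude", 100.0), ("crude", 250.0), ("power", 30.0), ("lng", 50.0)],
)

# Without an index, filtering on product is a full table scan; with one,
# the planner can seek directly to the matching rows.
conn.execute("CREATE INDEX ix_trades_product ON trades (product)")

# Inspect the plan the optimiser chose (SQLite's analogue of an execution plan).
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE product = ?", ("lng",)
).fetchall()
print(plan)  # the plan row should mention ix_trades_product
```

In SQL Server the equivalent inspection would use the graphical or XML execution plan rather than `EXPLAIN QUERY PLAN`, but the reasoning is identical.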
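The star-schema point above can be made concrete with a minimal sketch — again using Python's stdlib sqlite3, with invented table and column names — of a fact table joined to a single dimension, the defining shape of a star schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes, one row per product.
conn.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
# Fact table: measures plus foreign keys to dimensions -- the hub of the star.
conn.execute(
    "CREATE TABLE fact_trade (trade_id INTEGER PRIMARY KEY, "
    "product_key INTEGER REFERENCES dim_product(product_key), volume REAL)"
)
conn.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "crude"), (2, "power")])
conn.executemany("INSERT INTO fact_trade VALUES (?, ?, ?)",
                 [(10, 1, 500.0), (11, 2, 80.0)])

# Queries fan out from the fact table through single-hop joins; a snowflake
# schema would normalise dim_product further into sub-dimension tables.
rows = conn.execute(
    "SELECT d.name, SUM(f.volume) FROM fact_trade f "
    "JOIN dim_product d ON d.product_key = f.product_key "
    "GROUP BY d.name ORDER BY d.name"
).fetchall()
print(rows)  # [('crude', 500.0), ('power', 80.0)]
```

Knowing "when to use neither" typically means recognising workloads (e.g. raw event streams) that are better served by a flat or wide table than by either dimensional shape.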
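One common technique behind the time-series bullet — keeping performance acceptable at scale by downsampling raw ticks into fixed buckets — can be sketched with the standard library alone; the tick data and one-minute bucket size here are invented for illustration:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw ticks: (timestamp, price). At real scale these would
# stream from storage; a few rows are enough to show the bucketing idea.
ticks = [
    (datetime(2024, 1, 1, 9, 0, 5), 70.0),
    (datetime(2024, 1, 1, 9, 0, 40), 71.0),
    (datetime(2024, 1, 1, 9, 1, 10), 72.0),
]

# Downsample to one average per minute: truncate each timestamp to its
# minute, then average the prices that landed in that bucket.
buckets = defaultdict(list)
for ts, price in ticks:
    buckets[ts.replace(second=0, microsecond=0)].append(price)

downsampled = {minute: sum(prices) / len(prices)
               for minute, prices in sorted(buckets.items())}
for minute, avg in downsampled.items():
    print(minute.isoformat(), avg)
# 2024-01-01T09:00:00 70.5
# 2024-01-01T09:01:00 72.0
```

In the Azure services named above the same aggregation would be pushed down to the engine (e.g. a grouped query in Synapse) rather than done in application code.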
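Finally, the common-table-expression point in the Transact-SQL bullet can be illustrated with a minimal sketch — run here through Python's stdlib sqlite3 rather than SQL Server, since the `WITH` syntax is close enough to show the idea; table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, product TEXT, volume REAL)")
conn.executemany(
    "INSERT INTO trades (product, volume) VALUES (?, ?)",
    [("crude", 100.0), ("crude", 250.0), ("lng", 75.0)],
)

# A CTE keeps the aggregation readable and set-based, avoiding the
# row-by-row cursor loop that is usually the wrong tool for work like this.
rows = conn.execute(
    """
    WITH totals AS (
        SELECT product, SUM(volume) AS total_volume
        FROM trades
        GROUP BY product
    )
    SELECT product, total_volume FROM totals ORDER BY product
    """
).fetchall()
print(rows)  # [('crude', 350.0), ('lng', 75.0)]
```

The "when not to" half of the bullet matters just as much: a CTE referenced several times in one statement may be re-evaluated each time, at which point a temp table can be the better choice.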