ORS Partners

Senior DevOps Data Engineer

Job Locations: US-Radnor
Posted Date: 10/19/2020 3:47 PM
Address: 201 King of Prussia Rd
Category: Information Technology
Position Type: Regular Full-Time


Position Accountability

Design, build, operate, and monitor our cloud data pipeline assets using the latest tools and techniques to enable rapid evaluation, deployment, and tuning of schema and data changes. Ensure the data pipeline infrastructure meets non-functional requirements such as scalability, performance, and reliability through effective architecture design and by directing performance engineering efforts. Oversee cloud-based relational databases and non-relational, time-series data collections. Contribute to cross-team efforts that compile data from telemetry, application logging, and application data to identify and investigate anomalous activity. This is a key role on the DevOps team, connecting our innovative digital banking product to the distinct business units with reliable, high-quality, insightful data.


Expected Outcomes

  • Cloud data pipelines are developed and supported efficiently so that large, complex data sets are effectively maintained and continuously augmented
  • Manual processes around data delivery are identified so that automation can be designed and implemented for greater scalability and organizational efficiency
  • Infrastructure is configured for optimal extraction, transformation, and loading of data from a wide variety of external sources
  • Cloud-based analytics tools make extensive use of our highly reliable data infrastructure to provide actionable insights into customer acquisition, product performance, fraud detection, and other key business performance metrics
  • Knowledge sharing is performed across all technology teams to increase business alignment, quality, and efficiency
  • Partners are satisfied with our level of professionalism and technical expertise
  • Improvements to the SDLC in general, and to the operations experience in particular, are identified and delivered so that safe, high-quality, rapid releases are the status quo
  • Work is documented so that others can understand it
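The extraction, transformation, and loading outcomes above can be illustrated with a minimal sketch of one pipeline step. All names and sample records here are hypothetical; a real pipeline would read from external Azure sources rather than in-memory data:

```python
# Minimal sketch of an extract-transform-load (ETL) step.
# Every name and record below is illustrative, not from any real system.

def extract():
    """Pull raw records from a source system (stubbed with sample rows)."""
    return [
        {"account_id": "A1", "balance": "100.50", "status": "active"},
        {"account_id": "A2", "balance": "-3.25", "status": "closed"},
        {"account_id": "A3", "balance": "bad", "status": "active"},
    ]

def transform(rows):
    """Normalize types and drop rows that fail validation."""
    clean = []
    for row in rows:
        try:
            clean.append({
                "account_id": row["account_id"],
                "balance": float(row["balance"]),
                "active": row["status"] == "active",
            })
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine and log invalid rows
    return clean

def load(rows, store):
    """Write cleaned rows to a destination keyed by account_id."""
    for row in rows:
        store[row["account_id"]] = row
    return store

warehouse = load(transform(extract()), {})
```

The invalid row ("bad" balance) is filtered out during the transform step, which is the kind of data-integrity guarantee this role is responsible for automating.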


Position Responsibilities

  • This is a hands-on systems engineering role focused on building and deploying data pipelines within our Azure environment, with the express purpose of organizing our complex data effectively while ensuring data integrity and security
  • Partner with the Data Architect to meet the functional needs of business operations, business product management, and disparate technical teams by delivering best-in-class data solutions
  • Prepare, package, coordinate, and implement production releases and fixes
  • Troubleshoot, optimize, and tune the performance of SQL Server database servers (e.g., query tuning, server tuning, disk performance monitoring, memory pressure, CPU bottlenecks)
  • Apply DDL/DML expertise, optimizing and tuning performance using execution plans, Profiler, and the Database Engine Tuning Advisor
  • Troubleshoot application database connectivity, Azure SQL issues, query performance, disaster recovery, and other issues in production and development
  • Implement High Availability and Disaster Recovery solutions using Always On availability groups, clustering, replication, mirroring, and log shipping
  • Collaborate with business units to help define our data analytics strategy with the goal of building and driving adoption of a new unified business intelligence platform
  • Continuously monitor our business for new opportunities to automate inefficient internal processes around data acquisition and availability
  • Identify, assess, track and mitigate issues and risks at multiple levels within the organization
  • Expert in cloud-native computing (IaaS, PaaS, FaaS)
  • Expert in DevOps practices and modern CI/CD tools and deployment models
  • Experience with Azure or a related cloud provider
  • Experience with SQL Server
  • Experience with PowerShell
  • Experience with C#/.NET
  • Experience operating multi-tenant SaaS systems
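Troubleshooting application database connectivity, as described above, often comes down to distinguishing transient faults from real outages; transient faults are typically handled with retry logic and exponential backoff. A minimal sketch (the function names are illustrative and not tied to any specific database driver):

```python
import time

def with_retries(operation, max_attempts=3, base_delay=0.01):
    """Run `operation`, retrying transient failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            time.sleep(base_delay * (2 ** (attempt - 1)))

# Simulated flaky query: fails twice with a transient fault, then succeeds.
calls = {"n": 0}
def flaky_query():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network fault")
    return "rows"

result = with_retries(flaky_query)
```

In practice a driver-specific exception type (and a list of retryable error codes) would replace the bare `ConnectionError`, but the backoff structure is the same.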


Experience Required

  • Must have 5+ years of experience building and optimizing data pipelines, architectures and data sets within a cloud computing environment
  • Technical experience with the following: MSSQL Server, Azure SQL, Microsoft Azure, and Azure DevOps
  • Cloud experience: Microsoft Azure Cloud and Azure SQL Database, including installation, configuration, backup/restore, security, AD groups, managing resource groups, and storage blobs
  • Must have advanced working SQL knowledge, including experience authoring queries against relational databases as well as working with big data technologies such as NoSQL and unstructured data stores
  • Must have expertise in DataOps techniques and modern cloud computing technologies, most importantly deep familiarity with the Azure stack
  • Previous work experience partnering with data analytics teams to increase the effectiveness of internal business intelligence systems preferred
  • Strong attention to detail including precise and effective customer communications and proven ability to manage multiple, competing priorities simultaneously
  • Superior verbal and written communications skills and history of excellent team collaboration
  • Experience providing 24x7 production support for SQL Server databases on standalone and clustered servers, with sizes ranging from hundreds of GB to 20+ TB, to achieve maximum performance and uptime in production environments
  • Experience working as an Azure DevOps Automation Engineer for database deployment
  • Experience establishing database best practices and implementing ARM templates across environments
  • Experience creating linked ARM templates and integrating them with Azure DevOps pipelines
  • Hands-on, on-the-ground database expert who does the actual work; a technical advisor, site engineer, and point person for anything database-related
  • Degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field preferred


Experience with the following is a PLUS:

  • Automation experience: ideation, creation, and execution of database automation using Terraform and Ansible.
  • Experience with installation, configuration, maintenance, and administration of Azure Data Explorer (ADX).
  • Production support in a financial services setting.
  • A front-line leader who can act as a player/coach and provide lift to the engineering organization.


About BankMobile:

Established in 2015, BankMobile is a division of Customers Bank and America’s largest and fastest-growing mobile-first bank offering checking and savings accounts. BankMobile provides an alternative banking experience to the traditional model and is focused on technology, innovation, easy-to-use products, and education, with the mission of being “customer-obsessed” and creating “customers for life.” The disruptive, multi-partner distribution model, known as “Bank-as-a-Service,” created by the executive team enables BankMobile to acquire customers at higher volumes and substantially lower expense than traditional banks. Its low-cost operating model enables it to provide low-cost banking services to low- and middle-income Americans who have been left behind by the high-fee model of “traditional” banks. Today, BankMobile provides its “Bank-as-a-Service” platform to colleges and universities and currently serves nearly two million account holders at 800 campuses (covering one out of every three students in the U.S.). BankMobile operates as the digital banking division of Customers Bank, which is a Federal Reserve-regulated and FDIC-insured commercial bank.


For more information, please visit www.bankmobile.com. BankMobile, a division of Customers Bank, will provide consideration for employment to qualified applicants without regard to their race, color, religion, national origin, sex, protected veteran status or disability. BankMobile, a division of Customers Bank. Member FDIC - Equal Housing Lender - All Rights Reserved
