Director of Data Engineering
Job Purpose
The Director of Data Engineering serves as the strategic and technical lead for data architecture and pipeline development within the College of Charleston’s Division of Enrollment Planning. This role is responsible for designing, implementing, and maintaining scalable, automated, and standardized data workflows that improve the accuracy, efficiency, and accessibility of undergraduate enrollment data. Key responsibilities include developing robust data pipelines, orchestrating complex workflows, managing cross-functional data projects, and producing clear, comprehensive process documentation. The Director collaborates with multiple offices within the Division of Enrollment Planning and with partner offices across campus to ensure alignment with institutional data governance standards and technology infrastructure. This position plays a critical role in enabling data-informed decision-making by ensuring that high-quality data is readily available to stakeholders across the institution.
Minimum Requirements
A bachelor’s degree in computer science, information systems, or a related field, along with relevant experience in one or more of the following areas: systems development and maintenance, technical support, complex database management, computer programming, software application development, workflow automation, project management, or systems analysis. Experience working in a higher education environment is preferred. Candidates with an equivalent combination of education and professional experience are encouraged to apply.
Required Knowledge, Skills and Abilities
Required
- Strong understanding of relational database architecture, data warehousing, data engineering principles, system integrations, and cloud/web applications
- Experience designing and developing data workflows using middleware platforms such as Informatica or Axiom Elite
- Familiarity with project management concepts and agile methodologies
- Proficiency in one or more programming languages
- Skilled in Microsoft 365 tools, including Teams, SharePoint, Excel, PowerPoint, Copilot, and Power Automate
- Aptitude for effectively leveraging generative AI platforms to streamline tasks, enhance analysis, and support productivity
- Exceptional analytical and problem-solving skills with a passion for logic development
- High attention to detail and a strong commitment to data accuracy, integrity, and security
- Ability to identify and implement opportunities for improving data and business process efficiency
- Eagerness to learn, adopt, and share new tools and methodologies
- Proactive and adaptable in a fast-paced environment with strong prioritization skills
- Clear, professional communication skills for effective collaboration with technical and non-technical stakeholders
- Self-driven with a continuous improvement mindset
Preferred
- Experience with tools and platforms such as Salesforce Lightning, Informatica IICS, Ellucian Banner, Axiom Elite, TargetX CRM, Validity DemandTools, Apex, Visualforce, SQL, SOQL, Python, R, Java, Git, Asana
- Hands-on experience developing within CRM systems (e.g., Salesforce) and middleware applications (e.g., Informatica, Axiom Elite)
- Familiarity with admissions and higher education business processes and best practices
- Relevant certifications or digital badges in software, data engineering, or cloud technologies
Additional Comments Regarding Position
Depending upon experience level, this position can accommodate preferences for remote, hybrid remote, or on-campus work schedules.
Special Instructions to Applicants
A resume is required with application submission. A cover letter is strongly encouraged, and college transcript(s) showing relevant coursework are also encouraged.
Please complete the application to include all current and previous work history and education. A resume will not be accepted or reviewed to determine whether an applicant has met the qualifications for the position.
**Salary is commensurate with education and experience that exceed the minimum requirements.
Offers of employment are contingent upon a successful background check.
Salary **$68,801 - $80,000
Posting Date 06/17/2025
Closing Date 07/03/2025
Job Duties
Data pipeline development
Leads the design and execution of scalable data pipeline projects to ensure efficient, accurate data transmission across systems and databases. Defines functional requirements and architects robust data processing workflows within databases and middleware platforms, aligning with business and technical needs. Utilizes workflow visualization tools to document and communicate pipeline logic, data mappings, and transformation rules. Oversees the scheduling, monitoring, and optimization of automated ETL/ELT jobs, enhancing performance within database capacity and API constraints. Provides strategic input on SFTP configuration and ongoing maintenance to support secure data exchange. Acts as the functional administrator for Informatica IICS and Axiom Elite environments.
Essential or Marginal Essential
Percent of Time 60
Workflow orchestration
Executes and manages a combination of manual and semi-automated ETL/ELT workflows to ingest diverse data into CRM and SIS platforms, including prospective student records, standardized test scores, and application files. Partners with operational teams to validate data accuracy and ensure seamless integration for end users across multiple campus departments. Develops robust error-checking protocols and monitoring reports to uphold high standards of data integrity and reliability.
Essential or Marginal Essential
Percent of Time 25
Project management
Leads and coordinates data engineering initiatives by defining project scope, timelines, and deliverables in close collaboration with cross-functional teams. Drives project execution using agile methodologies to optimize resource allocation, monitor progress, and proactively mitigate risks. Communicates project status and technical challenges effectively to team members and stakeholders to support strategic prioritization across the data engineering portfolio.
Essential or Marginal Essential
Percent of Time 10
Process documentation
Develops and maintains data and business process documentation in both written and visual formats to ensure operational continuity and knowledge transfer. Ensures documentation remains current, accurate, and aligned with evolving workflows, translating complex technical procedures into accessible formats for non-technical audiences. Proactively engages colleagues for feedback to validate clarity, usability, and effectiveness of documentation, fostering a culture of shared understanding and continuous improvement.
Essential or Marginal Essential
Percent of Time 5
How to Apply
Please submit applications through the school's career website.