Data Architect (1087GO)
Who we are:
Sceel.io is a German software services provider and part of the Sigma Technology Group, working across technology consulting, team augmentation, and product development, with operations in Ukraine and Egypt.
Since 2018, from our headquarters in the heart of the automobile industry in Stuttgart, Germany, our talented professionals have delivered more than 20 successful projects across native mobile, web, desktop, and hybrid development and quality assurance, and we are open to much more!
Our teams include more than 25 talented developers and QA engineers who come from a variety of backgrounds. We are keen on building an inclusive culture based on trust and innovation, and we value every skill set while maintaining a family-like environment where everyone is heard and appreciated.
So, if you are searching for a challenging new role where you can use all of your software development skills and work closely with international organizations and up-to-date technologies, we are looking for you!
Responsibilities:
o Develop database solutions to store and retrieve company information
o Install and configure information systems to ensure functionality
o Analyze structural requirements for new software and applications
o Migrate data from legacy systems to new solutions
o Design conceptual and logical data models and flowcharts
o Improve system performance by conducting tests, troubleshooting and integrating new elements
o Optimize new and current database systems
o Define security and backup procedures
o Coordinate with the Data Science department to identify future needs and requirements
o Provide operational support for Management Information Systems (MIS)
o Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
o Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
o Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
o Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
o Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
o Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
o Analyze and organize raw data
o Evaluate business needs and objectives
o Interpret trends and patterns
o Conduct complex data analysis and report on results
o Explore ways to enhance data quality and reliability
o Identify opportunities for data acquisition
o Collaborate with data scientists and architects on several projects
o Use Extract, Transform, Load operations (the ETL/ELT process)
o Research new methods of obtaining valuable data and improving its quality
o Create structured data solutions using various programming languages and tools
o Mine data from multiple areas to construct efficient business models
o Collaborate with data analysts, data scientists, and other teams
Requirements:
o Degree in Computer Science, IT, or a similar field; a Master's degree in computer/data science or related technical experience is a plus
o Data Architecture certification is a plus.
o Proven work experience as a Data Architect, Data Scientist, Data Analyst or similar role
o In-depth understanding of database structure principles
o Experience gathering and analyzing system requirements
o Knowledge of data mining and segmentation techniques
o Expertise in SQL and Oracle
o Proficiency in MS Excel
o Familiarity with data visualization tools (e.g., Tableau, D3.js, and R)
o Proven analytical skills
o Problem-solving attitude
o 1+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols).
o Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
o Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required
o Experience in team management, communication, and presentation
o Experience with SQL, ODI, and ETL/ELT
o Experience building or maintaining ETL processes
o Communication skills, especially explaining technical concepts to non-technical business leaders.
Technical Knowledge/Skills:
o Strong experience with advanced analytics tools for object-oriented/object function scripting using languages such as R, Python, Java, KNIME, or others
o Strong ability to design, build, and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata, and workload management
o Ability to work with both IT and the business to integrate analytics and data science output into business processes and workflows
o Strong experience with popular database programming languages, including SQL, PL/SQL, and others, for relational databases, and certifications on upcoming NoSQL/Hadoop-oriented databases such as MongoDB or Cassandra for non-relational databases
o Strong experience working with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures, and integrated datasets using traditional data integration technologies (ETL/ELT, data replication/CDC, message-oriented data movement, API design and access) as well as upcoming data ingestion and integration technologies such as stream data integration, CEP, and data virtualization
o Strong experience working with and optimizing existing ETL processes and data integration and data preparation flows, and helping to move them into production
o Basic experience working with popular data discovery, analytics, and BI software tools like Tableau for semantic-layer-based data discovery.
o Demonstrated success in working with large, heterogeneous datasets to extract business value using popular data preparation tools.
o Adept in agile methodologies and capable of applying DevOps and increasingly DataOps principles to data pipelines to improve the communication, integration, reuse, and automation of data flows between data managers and consumers across an organization.
o Technical expertise with data models, data mining, and segmentation techniques
o Hands-on experience with SQL database design
o Experience with ETL and data lakehouse architectures
o Knowledge of Informatica
o Knowledge of Cloudera
o Data mining and modeling
Work Model: Hybrid (2 days from the office, 3 from home)
Working Hours: Flexible
Fridays and Saturdays are off
Perks and Benefits:
A basic monthly net salary determined by experience
Social and Medical Insurance
Salary reviews after the trial period based on performance
Unlimited drinks at the office
An amazing, accessible office in Maadi with a relaxing outdoor garden
Career and skills development
Last but not least, an AMAZING team!