15Nov

Data Intelligence Manager at Publicis Groupe – Makati, Philippines


Company Description

Publicis Groupe is not just a company you work for; it’s a platform for you to take your talent to the world.

If you want to help change the world, ideas alone are not enough. Real impact can only come from having meaningful access to a world of knowledge, people and resources. At Publicis Groupe, you are connected to our global network, intelligence, tools, clients, brands and 80,000 brilliant minds with expertise in data, technology, media, strategy, creativity and business transformation, all literally at your fingertips.

Go ahead, the world is waiting.

Publicis Groupe is the third largest communications group in the world. Founded in Paris in 1926, we are present in more than 100 countries as leaders in marketing, communication, and digital business transformation. Two of our biggest solution hubs are in Singapore: Publicis Communications and Publicis Media & Digital.

Publicis Communications, the creative communications hub of the Publicis Groupe, is a collective of the most passionate, purposeful, and progressive creative agencies in Singapore. They are Publicis Worldwide, Leo Burnett, Saatchi & Saatchi, Prodigious, and MSL.

Publicis Media & Digital, which is comprised of global media agency brands Starcom, Zenith, Spark Foundry, and Performics, is powered by digital-first, data-driven global practices that together, help our clients navigate the modern media landscape.

Our two other solution hubs, Publicis Sapient and Publicis Commerce, empower businesses to embrace digital transformation and equip them with a total commerce experience.

Job Description

The Research and Analytics Manager is part of the Publicis Media Data Intelligence team, responsible for supporting integrated data, research, reporting and analytics requirements through proprietary and syndicated products and solutions. The thrust is to enhance operational efficiency and scale solutions, bringing incremental value and growth to Publicis clients.

 

Primary Responsibilities:

  • Solve complex business challenges through advanced analytics such as Market Mix Modelling, Attribution Studies, Predictive Modelling, etc.
  • Collaborate with media teams to understand their needs and apply advanced analytical methods to enhance their strategies and campaigns
  • Manage analytics project stakeholders through requirements gathering, solution design, planning, implementation, and reporting/presentation
  • Provide project management for research and analytics client requirements
  • Be the liaison for research projects, maintaining the relationship with our research partners
  • Develop and drive innovation for advanced analytics products and solutions by building proof-of-concept models and staying at the forefront of new tech, best practices, etc.
  • Drive adoption of advanced analytics products and solutions across the agency
  • Oversee and mentor the senior research analyst in ensuring accurate and quality output

Qualifications

Qualifications and desired experience:

  • Strong analytical and critical-thinking skills with a data-driven approach to problem solving
  • Experience in R, Python, and other programming languages a must
  • Strong knowledge of and experience in statistical and machine-learning techniques and in conducting quantitative analysis, regression analysis, predictive modelling, brand lift solutions, etc.
  • Well-versed in quantitative research – understanding of research designs and methodologies
  • Ability to analyze large and complex datasets and translate them into clear and compelling narratives
  • Knowledge of media advertising and the marketing industry highly preferred
  • Experience with various marketing analytics models such as Market Mix Modeling and Multi-touch Attribution highly preferred
  • Advanced Excel skills and expertise in business intelligence/data visualization platforms a plus
  • Strong presentation skills with the ability to communicate and simplify complex ideas for internal and external stakeholders, including senior-level clients and management
  • Demonstrated project management ability, ensuring timely and accurate delivery of outputs

Additional Information

Our employee benefits package comprises the following:

👉 Generous leave entitlements, including birthday leave and emergency leave.

👉 Additional Day Off on Mental Wellness Month.

👉 Insurance coverage on Day 1, plus a free dependent.

👉 A hybrid working schedule with Friday disconnects.

👉 Rest, Relax & Recharge – office closure during the last week of December every year.



Source link

14Nov

Sr Associate, Data Engineer at Pfizer – USA – NY – Headquarters


ROLE SUMMARY

Pfizer has established a chief digital office which will lead the transformation of Pfizer into a digital powerhouse, generating superior patient experiences that result in better health outcomes.

The Enterprise Data and Analytics team, which is part of the AI & Data Organization within the Chief Digital Office, is responsible for the development and management of all data and analytic tools and platforms across the enterprise – from manufacturing to global product development and for all Pfizer countries. One of the team’s top priorities is the development of an enterprise data lake which will serve as the engine for the company’s digital transformation.

Pfizer is seeking a Data Engineer to help realize the Pfizer Digital strategy on the cloud by designing and delivering data products. The Data Engineer will design, architect, and build upon our PGS (Pfizer Global Supply) Data Lake to develop value-added data products that span the areas of Supply Chain Execution, Planning, Manufacturing, Quality, Logistics, etc.

The Supply Chain Data Engineer will be hands-on, working with a group of talented engineers and partnering with Digital Partner(s), Enterprise Architecture, and Business Unit stakeholders to develop and sustain the PGS data strategy and target data architecture. Data engineers also partner with solution development teams to ensure use case delivery goals are met while adhering to data architecture principles, guidelines, and standards.

ROLE RESPONSIBILITIES

  • Reporting to the Data Management Sr. Manager, the Sr. Associate Data Engineer will be responsible for data modeling and engineering with the advanced data platforms team to drive digital outcomes.

  • Conceive, design, and implement Cloud Data Lake, Data Warehouse, Data Marts, and Data APIs 

  • Develop complex data products that are beneficial for PGS and allow for reusability across the enterprise.

  • Ability to work with a team of contractors that will deliver technical enhancements

  • Design automated solutions for building, testing, monitoring, and deploying ETL data pipelines in a continuous integration environment

  • Develop internal APIs and data solutions to power applications and promote connectivity

  • Expand Pfizer’s portfolio of capabilities through coherent APIs for internal users and third parties.

  • Develop new systems and tools to enable the teams to consume and understand data more intuitively

  • Coordinate with backend engineering team to analyze data in order to improve the quality and consistency of our data

  • Perform root cause analysis and resolve production and data issues

  • Create test plans, test scripts and perform data validation

  • Fine Tune SQL queries, programs, reports and ETL pipelines

  • Help innovate by building POCs and pilots, identifying new approaches that improve efficiency and simplicity

  • Build and maintain data dictionary and process documentation

  • Ability to innovate and present solutions to leadership, management, architects and developers.

  • Provide hands-on technical and thought leadership across development, product management, operations, and engineering management through effective communication. Actively develop mentoring relationships within or across teams to help others further their careers

Professional Experience and Educational Requirements

  • Bachelor’s degree in Computer Science, Engineering or related field with three years of relevant experience; OR Master’s degree with one year of relevant experience; OR Associate’s degree with six years of relevant experience; OR Eight years of relevant experience with a high school diploma or equivalent.

  • 3+ years of experience as a Data Engineer, OR 3+ years of experience as a Data Architect, OR 3+ years of experience in data warehousing, data modeling, and data transformation, OR 3+ years of experience designing complex and inter-dependent data models for analytic use cases. 3 or more years of experience with one or more general-purpose programming languages, including but not limited to:

  • Experience in Python, SQL, Java, Scala, PySpark, C, C++, C#, Swift/Objective-C, or JavaScript

  • Experience with data preparation and ETL tools: Airflow, dbt, Fivetran, Kafka, Informatica, Talend, Alteryx, etc.

  • Experience with software engineering best practices, including but not limited to version control (Git, TFS, Subversion, etc.), CI/CD (Jenkins, Maven, Gradle, etc.), automated unit testing, and DevOps.

  • Understanding of cloud platforms like Snowflake, Databricks, S3, Redshift, BigQuery, etc.

  • Architected end-to-end data pipelines with a cloud or on-prem stack

  • Prior experience as a data modeler is a must

  • Knowledge of Cloud computing, machine learning, text analysis, NLP & Web development experience is a plus

  • Knowledge of Ontologies and Graph Databases (Neo4j, Titan, etc.) and query syntax is a plus

  • Prior experience working for Biotech / Pharma companies is a plus

Professional and Leadership Characteristics

  • Strategic Thinker: Understands the overall context, business implications and is able to think through a problem both conceptually and logically for better solutions

  • Creative: Able to bring forth new ideas to improve our existing practices and takes calculated risks to innovate new capabilities within Pfizer Digital Business Analytics, with a focus on data products and analytics solutions

  • Analytical Thinker: Understands how to synthesize facts and information from varied data sources, both new and pre-existing, into discernable insights and perspectives; takes a problem-solving approach by connecting analytical thinking with an understanding of business drivers and how BA can provide value to the organization

  • Adaptable: Demonstrates flexibility in the face of shifting targets, thrives in new situations

  • Pioneering: Pushes self and others to think about new innovation and digital frontiers and ways to conquer them

  • Ambiguity Tolerant: Successfully navigates ambiguity to keep the organization on target and deliver against established timelines

  • Strong Data and Information Manager: Understands and uses analytical skills/tools to produce data in a clean, organized way to drive objective insights 

  • Exceptional Communicator: Can understand, translate, and distill the complex, technical findings of the team into commentary that facilitates effective decision making by senior leaders; can readily align interpersonal style with the individual needs of customers

  • Highly Collaborative: Manages projects with and through others; shares responsibility and credit; develops self and others through teamwork; comfortable providing guidance and sharing expertise with others to help them develop their skills and perform at their best; helps others take appropriate risks; communicates frequently with team members earning respect and trust of the team  

  • Proactive Self-Starter: Takes an active role in one’s own professional development; stays abreast of analytical trends, and cutting-edge applications of data

Other Job Details:

  • Last Date to Apply for Job: November 18, 2024

  • Work Location Assignment: Hybrid. Must be able to work from assigned Pfizer office 2-3 days per week, or as needed by the business

The annual base salary for this position ranges from $74,900.00 to $124,800.00.* In addition, this position is eligible for participation in Pfizer’s Global Performance Plan with a bonus target of 7.5% of the base salary. We offer comprehensive and generous benefits and programs to help our colleagues lead healthy lives and to support each of life’s moments. Benefits offered include a 401(k) plan with Pfizer Matching Contributions and an additional Pfizer Retirement Savings Contribution, paid vacation, holiday and personal days, paid caregiver/parental and medical leave, and health benefits to include medical, prescription drug, dental and vision coverage. Learn more at Pfizer Candidate Site – U.S. Benefits | (uscandidates.mypfizerbenefits.com). Pfizer compensation structures and benefit packages are aligned based on the location of hire. The United States salary range provided does not apply to Tampa, FL or any location outside of the United States.* The annual base salary for this position in Tampa, FL ranges from $67,400.00 to $112,300.00.

Relocation assistance may be available based on business needs and/or eligibility.

Sunshine Act

Pfizer reports payments and other transfers of value to health care providers as required by federal and state transparency laws and implementing regulations.  These laws and regulations require Pfizer to provide government agencies with information such as a health care provider’s name, address and the type of payments or other value received, generally for public disclosure.  Subject to further legal review and statutory or regulatory clarification, which Pfizer intends to pursue, reimbursement of recruiting expenses for licensed physicians may constitute a reportable transfer of value under the federal transparency law commonly known as the Sunshine Act.  Therefore, if you are a licensed physician who incurs recruiting expenses as a result of interviewing with Pfizer that we pay or reimburse, your name, address and the amount of payments made currently will be reported to the government.  If you have questions regarding this matter, please do not hesitate to contact your Talent Acquisition representative.

EEO & Employment Eligibility

Pfizer is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability or veteran status.  Pfizer also complies with all applicable national, state and local laws governing nondiscrimination in employment as well as work authorization and employment eligibility verification requirements of the Immigration and Nationality Act and IRCA.  Pfizer is an E-Verify employer.  This position requires permanent work authorization in the United States.

Information & Business Tech



Source link

14Nov

Engineer – Fab10 Facilities Control at Micron Technology – Fab 10N/X, Singapore


Our vision is to transform how the world uses information to enrich life for all.

Join an inclusive team passionate about one thing: using their expertise in the relentless pursuit of innovation for customers and partners. The solutions we build help make everything from virtual reality experiences to breakthroughs in neural networks possible. We do it all while committing to integrity, sustainability, and giving back to our communities. Because doing so can fuel the very innovation we are pursuing.

JR63729 Engineer – Fab10 Facilities Control

As the Facilities Smart Facilities & Control Engineer, you will lead engineering and maintenance work in the fields of Facilities and Control systems on site, with guidance and direction from the Facilities organization, to achieve facilities department, site, facilities corporate, and company goals.
 

Job Responsibilities:

  • Lead on-site Smart System transformation and the implementation of smart facilities projects with AI-enabled detection and analytics, in collaboration with other Micron departments.

  • Lead Sustainability and Environmental Control system deployment with early detection and prediction, using Industry 4.0 systems.

  • Lead labor-efficiency automation projects with innovative ideas to improve facilities team efficiency.

  • Plan, design, and lead Control Automation engineering projects, including new construction, system upgrades, and quality improvements.

  • Lead and attend to abnormalities in systems under the Smart FAC & Control section in a timely manner, in support of Operations, when required.

  • Lead the planning of, and supervise, Preventive and Corrective Maintenance work activities; set priorities and reallocate resources when required.

  • Perform capacity tracking and planning for Instrumentation and Control systems.

  • Acquire cost estimates and raise purchase requisitions for services or parts required for repair and maintenance purposes, when required.

  • Use computer-based software to document and analyze work orders and equipment history.

Qualifications & Skills:

  • Bachelor’s Degree in Electrical/Electronics/Computer Engineering or an equivalent field.

  • Experience in programming programmable logic controllers (PLC) and process automation PID control.

  • Experience and good knowledge of various handheld tools and portable and panel measuring devices and equipment.

  • Experience with SCADA, HMI, and reporting software.

  • Proficient in Microsoft Office.

  • Ability to use and build schematics, single-line drawings, and related equipment and system documentation.

  • Good knowledge of Python, Java, C/C++, C#, SQL, and machine-learning software.

  • Ability to work in a team environment with various professional levels.

About Micron Technology, Inc.

We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience.

To learn more, please visit micron.com/careers

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.

To request assistance with the application process and/or for reasonable accommodations, please contact

hr**********@mi****.com

Micron Prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards.

Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron.



Source link

14Nov

Writing LLMs in Rust: Looking for an Efficient Matrix Multiplication | by Stefano Bosisio | Nov, 2024


Starting from Karpathy’s llm.c, I asked myself: “Could I write this in Rust?” Here are the lessons I learned and how I am writing llm.rust. In this first article, let’s tackle the matrix multiplication problem.

Image by GoogleDeepMind on Unsplash

Matrix multiplication may be the most important operation in Machine Learning. I still remember when I was an engineering student: in one of the first linear algebra lessons, the teacher started to explain matrices, eigenvectors, and orthonormal bases. I was very confused, and it took me a little while to understand why we were bothering so much with matrices and basis sets, and what a good basis implies for our world. From then on, I have always found linear algebra fascinating and, from a pure computer science point of view, I have marveled at all the algorithms that try to be more and more efficient in handling matrices.

In particular, we know that the matrix-vector product is pretty simple, but things get more and more complicated when we move to matrix-matrix or tensor-tensor products. From there, many methodologies have been implemented to optimize matrix multiplication. For example, a long time ago I posted about DeepMind
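As a concrete starting point, here is a minimal sketch of the textbook triple-loop matrix product in Rust, together with the classic i-k-j loop reordering, often the first cache-locality win for row-major data. This is an illustration, not code from llm.rust; the function names are my own:

```rust
// Naive O(m*k*n) matrix multiplication: a is m x k, b is k x n,
// the result c is m x n. All buffers are flat, row-major slices.
fn matmul_naive(a: &[f32], b: &[f32], m: usize, k: usize, n: usize) -> Vec<f32> {
    let mut c = vec![0.0f32; m * n];
    for i in 0..m {
        for j in 0..n {
            let mut sum = 0.0;
            for p in 0..k {
                // b is walked column-wise here (stride n): poor cache locality.
                sum += a[i * k + p] * b[p * n + j];
            }
            c[i * n + j] = sum;
        }
    }
    c
}

// Same result, i-k-j loop order: the inner j-loop walks both b and c
// contiguously, so each cache line of b is fully used before eviction.
fn matmul_ikj(a: &[f32], b: &[f32], m: usize, k: usize, n: usize) -> Vec<f32> {
    let mut c = vec![0.0f32; m * n];
    for i in 0..m {
        for p in 0..k {
            let aip = a[i * k + p];
            for j in 0..n {
                c[i * n + j] += aip * b[p * n + j];
            }
        }
    }
    c
}

fn main() {
    // 2x3 times 3x2 example.
    let a = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0];
    let b = [7.0, 8.0, 9.0, 10.0, 11.0, 12.0];
    let c1 = matmul_naive(&a, &b, 2, 3, 2);
    let c2 = matmul_ikj(&a, &b, 2, 3, 2);
    assert_eq!(c1, c2);
    println!("{:?}", c1); // prints [58.0, 64.0, 139.0, 154.0]
}
```

Both variants agree on this tiny example; the payoff of the reordering only shows up at sizes where rows of `b` no longer fit in cache.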



Source link

14Nov

Research Associate – AI at Keywords Studios Plc – San Francisco, California, United States – Remote


Work Location: USA Remote (Prefer PST Time Zone)

Work Hours: M – F, 9:00 am – 5:30 pm

Pay Rate: $22 – $27/hr DOE and location

The Research Associate is a non-traditional role in which you will work on our data collection and quality team focusing on the improvement of an AI engine.

In addition to an ability to write clearly and concisely, successful Research Associates must be able to tailor their writing style to each assignment’s requirements, must possess solid research skills and be able to quickly paraphrase their findings, and will be called upon to evaluate Large Language Model (LLM) prompts written by others.

The ideal candidates will have a solid ability to focus on efficiency and problem-solving, and excellent writing and reading comprehension skills, including experience in creating and composing text within a specified amount of time.

This role provides opportunities for career advancement.

Note: There is no coding or software development as part of this role.

What You’ll Achieve (Responsibilities): 

  • Work collaboratively in a fast-paced environment
  • Work on various client projects to train generative AI models by creating prompts and responses based on the instructions provided and on established best practices for quality prompts
  • Given examples, generate similar prompts and responses
  • Execute different use cases, collecting data in support of the AI engine
  • Fine-tune AI training prompts for more consistent results
  • Work with a small team of Data Specialists in annotation and labeling projects
  • Engage and assist in regular team training
  • Help identify areas for process improvements
  • Assist in documenting processes
  • Provide quantitative and qualitative feedback
  • Provide feedback on tools being used and on potential alternatives
  • Use a variety of communication channels such as Slack, Teams, and SharePoint to learn about new projects, collaborate with your team, and ask questions
  • Learn new software programs on the job
  • Provide supporting documentation when the AI fails

Keywords provides a competitive compensation package, good benefits and a casual, fun, productive and supportive working environment. We empower people to perform to the best of their ability with our “can do” attitude. We appreciate and embrace flexibility and learn at every opportunity to grow ourselves through experience, training and tackling new challenges. This is what makes us Keywordians.

Requirements

  • A college degree in Computer Science, Math and Statistics, or Cognitive Science.
  • Preferred prior work experience or college studies in AI
  • Process-oriented, focused on problem-solving, an effective communicator, efficient, and highly organized, with strong attention to detail
  • An ability to learn, document, and work with the team on new technologies and processes
  • Ability to gain new skills and knowledge through hands-on experience
  • Experience in assisting in system troubleshooting & finding resolutions
  • Keen eye for detail
  • Strong Communication Skills (Oral and Written)
  • Demonstrated ability to work independently
  • Strong time management skills
  • Exemplify a “get it done” attitude, including a high level of accountability, transparency, and teamwork first and foremost

Benefits

At Keywords we provide all our contingent workforce with:

  • Paid Time Off (including sick days and holidays)
  • 401k (3% matching)
  • Medical, Dental and Vision benefits



Source link

14Nov

Cloud Consultant Intern, AWS Professional Services at Amazon.com – Seattle, Washington, USA


Note: Amazon internships are full-time (40 hours/week) for 12 consecutive weeks, starting May/June 2025. Applicants should have at a minimum one quarter/semester remaining after their internship concludes. This internship role is required to be onsite 5 days a week in Arlington, VA or Seattle, WA.

Do you want to experiment with innovative technologies, including Cloud Computing, Machine Learning, and GenAI? Are you passionate about educating, training, designing, and building cloud computing solutions to help customers solve their most challenging problems?

Amazon Web Services (AWS), a leader in Cloud Computing, is seeking interns to join our AWS Professional Services Internship program. This is a unique opportunity to play a key role in a fast-growing business and to deliver value to AWS customers of all sizes from startups to global brands. The skills and experiences you gain will be highly sought after throughout the industry and give you the opportunity for the career of a lifetime.

The AWS Professional Services organization is a global team of experts that helps AWS customers realize their desired business outcomes when using the AWS Cloud. We deliver focused guidance through our global specialty practices, which cover a variety of solutions, technologies, and industries. In addition to working alongside our customers, we share our experience through technical artifacts and proof of concept demos.

As a Professional Services intern, you will gain hands-on experience in cloud computing, develop business acumen, and learn about Amazon’s peculiar culture. You will work on projects, have the opportunity to obtain AWS certifications, and attend professional development events. This internship leads to a full-time role in Arlington, VA, with up to 75% travel. While many of our customers engage ProServe virtually, there is always the potential that a customer might choose to include travel in their contract.

Upon successful completion of the internship program, selected interns will receive a full-time offer to join the Professional Services team and go into our AWS Technical training program. This program is an accelerated career development program for recent graduates and early-career professionals entering technical roles to advance their skills and help customers design and build flexible and resilient cloud-based solutions. While in this program, you will learn key technical and professional skills from top AWS subject matter experts that you can use in your career working with AWS’s customers.

Applications are reviewed on a rolling basis. For an update on your status, or to confirm your application was submitted successfully, please login to your candidate portal. Please note that we are reviewing a high volume of applications and appreciate your patience.

The program offers the opportunity to specialize in an area of interest including:

Application Developer – AppDev resources are specialists in designing applications that run natively in the cloud using programming languages such as Python, JavaScript, Typescript, etc. They are experts in building programs that run on any number of platforms including virtualized instances, containers, or serverless architecture. They are full-stack developers with backend, frontend, and UX skillsets.

Data & Analytics – The Data & Analytics role supports our services that leverage data and produce business insights, which may include using Machine Learning/Artificial Intelligence/Generative Artificial Intelligence (ML/AI/GenAI), helping our customers use and integrate Big Data services in what is arguably our industry’s most exciting space. The portfolio of services covers EMR (Hadoop), DynamoDB (NoSQL), MongoDB, Apache Cassandra, Amazon Q, and Bedrock.

Key job responsibilities
· Build solutions that demonstrate technical breadth and practical knowledge of applicable AWS services
· Present technical content to internal and external audiences and collaborate across diverse stakeholder groups to deliver business value for Customers.
· Contribute to workload discovery, requirements gathering, and preparation of engagement-related presentations.

About the team
Mentorship & Career Growth
Our team is dedicated to supporting new team members in an environment that celebrates knowledge sharing and mentorship. Projects and tasks are assigned in a way that leverages your strengths and helps you further develop your skillset.

Inclusive Team Culture
Here at AWS, we embrace our differences. We are committed to furthering our culture of inclusion. We have ten employee-led affinity groups, reaching 40,000 employees in over 190 chapters globally. We have innovative benefit offerings, and host annual and ongoing learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences.

Work/Life Harmony
Our team puts a high value on work-life harmony. It isn’t about how many hours you spend at home or at work; it’s about the flow you establish that brings energy to both parts of your life. We believe striking the right balance between your personal and professional life is critical to life-long happiness and fulfillment. We offer flexibility and encourage you to find your own balance between your work and personal lives.

Basic Qualifications

– Enrolled in an Associate or greater degree program (e.g. Bachelor’s degree, Master’s degree) in a STEM-related field such as Computer Science, Computer Engineering, Information Technology, or other related fields, with a conferral date between December 2025 and June 2026
– Experience with Java and Python
– Experience with one or more of the following programming languages/technologies: Typescript, React, Angular, Node.js, Ruby, GoLang, R, C, or C++
– Experience with one or more of the following: networking fundamentals, security, databases (relational and/or NoSQL), operating systems (Unix, Linux, and/or Windows)

Preferred Qualifications

– Experience with software development lifecycle (SDLC) and agile/iterative methodologies
– Knowledge of the primary AWS services such as EC2, ELB, RDS, VPC, Route53, and S3
– Basic experience setting up cloud environment with AWS
– Experience with infrastructure as code, ops automation, and configuration management tools such as Chef, Puppet, or Ansible

Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status. For individuals with disabilities who would like to request an accommodation, please visit https://www.amazon.jobs/en/disability/us.

Our compensation reflects the cost of labor across several US geographic markets. The base pay for this position ranges from $42.50/hr in our lowest geographic market up to $92.60/hr in our highest geographic market. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience. Amazon is a total compensation company. Dependent on the position offered, equity, sign-on payments, and other forms of compensation may be provided as part of a total compensation package, in addition to a full range of medical, financial, and/or other benefits. For more information, please visit https://www.aboutamazon.com/workplace/employee-benefits. This position will remain posted until filled. Applicants should apply via our internal or external career site.



Source link

13Nov

Data Engineer at SK Battery America – Georgia


About SK Battery America (SKBA)

As the flagship entity of SK On and a leader in the electric vehicle (EV) battery manufacturing sector, SKBA is at the forefront of advancing high-nickel technology. With revenue of $2 billion and a dedicated team of over 3,000 employees, we provide high-energy-density battery cells to top automotive brands like Ford, Volkswagen, and Hyundai Motors. SKBA’s North American expansion is marked by strategic joint partnerships with industry giants such as Ford and Hyundai, underscoring our commitment to the evolving EV landscape.

Role and Responsibilities

  • Manage operation of the data analysis system
  • Respond promptly to growing requirements from the technology and manufacturing teams
  • Devise solutions to business problems using data-driven approaches
  • Use SQL to analyze and organize raw production data
  • Build views, data marts, and data pipelines using SQL and ETL tools
  • Combine raw information from various systems
  • Create and manage dashboards and reports that communicate insights to stakeholders
  • Identify opportunities for data acquisition
  • Interpret trends and patterns from dashboards and materials
  • Conduct complex data analysis and share results with SMEs
  • Collaborate and communicate with both process engineers and management
  • Explore ways to enhance data quality and reliability
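
The SQL-centric duties above (building views and data marts that combine raw data from multiple systems) can be sketched minimally as follows. This is an illustrative example with a hypothetical schema (`mes_output`, `qc_results`, and the `lot_quality` view are invented for this sketch), using Python's built-in `sqlite3` as a stand-in for a production database:

```python
import sqlite3

# Hypothetical schema: raw production data lands in per-system tables,
# and a reporting view ("data mart") joins them and derives a yield metric.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE mes_output (lot_id TEXT, cells_produced INTEGER);
CREATE TABLE qc_results (lot_id TEXT, defect_count INTEGER);
INSERT INTO mes_output VALUES ('LOT-001', 1000), ('LOT-002', 950);
INSERT INTO qc_results VALUES ('LOT-001', 4), ('LOT-002', 19);
-- Reporting view combining both source systems
CREATE VIEW lot_quality AS
SELECT m.lot_id,
       m.cells_produced,
       q.defect_count,
       ROUND(1.0 - q.defect_count * 1.0 / m.cells_produced, 4) AS yield_rate
FROM mes_output m
JOIN qc_results q ON q.lot_id = m.lot_id;
""")
rows = cur.execute("SELECT * FROM lot_quality ORDER BY lot_id").fetchall()
for row in rows:
    print(row)
```

In practice the same pattern (stage raw tables per source system, then expose curated views or marts for dashboards) would run on the plant's actual database and ETL tooling rather than in-memory SQLite.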

Basic Qualifications:

  • Bachelor’s degree in Computer Science, IT, or a related field
  • 3+ years of experience in data engineering, data analysis, or a similar role
  • 3+ years of experience with SQL and data modeling
  • 3+ years of hands-on experience with SQL database design
  • 1+ years of experience with data visualization tools (e.g., Spotfire, Power BI, Tableau)
  • 1+ years of experience in IT system operation and maintenance
  • Knowledge of programming languages, networking, and infrastructure

Nice to have:

  • Experience with data engineering or analysis in a manufacturing business (preferred)
  • Experience with HDFS, Kafka, NiFi, Apache Kudu, and Impala (preferred)
  • Experience in the AI/ML field (preferred)
  • Bilingual in Korean a plus



Source link

13Nov

Summer Fellowship 2024 Wrap Up – What Did Our Fellows Work On?


Summer and Winter Fellowships provide an opportunity for early-career individuals and established professionals new to the field of AI governance to spend three months working on an AI governance research project, deepening their knowledge of the field, and forging connections with other researchers and practitioners. 

Our 2024 Summer Fellows came from a variety of disciplines and a range of prior experience – some fellows ventured into entirely new intellectual territory for their projects, while others used the time to extend their previous work.

We extend our sincere appreciation to all our supervisors for their dedicated mentorship and guidance this summer.

If you’re interested in applying for future fellowships, check out our Opportunities page. You can register your expression of interest here.



Source link

13Nov

Data Platform Architect at ServiceNow – Hyderabad, India


Company Description

It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today — ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.

Job Description

ServiceNow is seeking a Data Platform Architect for the Enterprise Data Team, based in Hyderabad. This is a senior individual contributor role with hands-on involvement in solution design for the end-to-end data ecosystem. The Data Platform Architect will provide technical expertise to the Analytics org on strategic programs and enterprise initiatives; lead the tech roadmap and technical evaluations; contribute architectural value additions and best practices; provide optimal design recommendations; and help build next-generation analytics.

  • Platform Design and Architecture:
    • Lead the design and development of the data platform architecture, ensuring scalability, performance, reliability, and security
    • Define and implement Standards for data modeling, data integration, and data lifecycle management
    • Bring deep familiarity with the modern data platform stack, with end-to-end coverage for building large-scale Data and AI solutions
    • Create blueprints for data pipelines, data lakes, data warehouses, and analytical systems
    • Provide technical leadership in choosing appropriate technologies for data processing, cloud compute, and storage solutions
  • Technical Solutions and Roadmap:
    • Influence enterprise architecture design conversations and deliver sophisticated data solutions
    • Work closely with leaders, data engineers, data scientists, and analysts to define and refine data platform requirements
    • Lead cross-functional teams to develop and integrate new data products and solutions
    • Understand business needs and translate them into data solutions and architecture roadmap that add value to the organization
  • Cloud usage and Governance:
    • Design and implement cloud-based solutions for data processing and storage (e.g., Azure, Snowflake, Databricks, GCP)
    • Optimize cloud resources for cost efficiency, performance, and availability
    • Ensure the security and compliance of data platforms, addressing regulatory and privacy concerns
    • Develop strategies to enforce data governance policies, ensuring data quality, consistency, and integrity across systems
    • Design data security measures and control access to sensitive data through role-based access and encryption
  • Innovation & Continuous Improvement:
    • Stay up-to-date with emerging technologies and trends in data architecture, big data, cloud computing, and AI
    • Recommend and lead initiatives to improve the performance, scalability, and efficiency of data processing and storage systems
    • Act as the Data Architecture subject matter expert to drive the innovation for the company
  • Documentation and Technical design:
    • Produce detailed documentation for platform architecture, data models, and data workflows
    • Be well versed in technical design, diagrams, and documentation tools

Qualifications

What you need to be successful in this role: 

  • Experience in designing and implementing end-to-end data platforms, including data lakes, data warehouses, and data integration pipelines.
  • Experience designing and developing a low-latency, high-throughput, enterprise-grade data architecture ecosystem
  • Knowledge of relational and non-relational databases, and big data technologies (e.g., Hadoop, Spark, Kafka).
  • Expertise in cloud and platform tooling such as Azure, Snowflake, Databricks, GitHub, and Jenkins
  • Strong knowledge of ETL processes and tools for real-time data processing
  • Proficiency in building data solutions with tools such as Apache Kafka, Apache Airflow, and dbt, along with Python
  • Strong understanding of SQL and data querying best practices
  • Proficiency in managing and deploying solutions on cloud platforms such as Azure, Snowflake, Databricks
  • Experience with data encryption, privacy, and security best practices, including GDPR compliance
  • Excellent problem-solving and communication skills
  • Strong scripting skills in Python, Shell, or similar languages for automation and process optimization
  • Familiarity with CI/CD pipelines, version control (Git), and deployment automation tools (Jenkins, Terraform)
  • Familiarity with BI tools such as Tableau, Power BI, or Looker, as well as experience working with data scientists and analysts to support analytical workloads
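
The governance and ETL qualifications above hinge on enforcing data quality before records reach downstream systems. A minimal sketch of such a quality gate, with hypothetical field names and rules (`record_id`, `amount`, `event_time` and the checks are invented for illustration, not taken from any ServiceNow pipeline):

```python
from datetime import datetime

# Hypothetical data-quality rules a pipeline step might enforce before loading.
REQUIRED_FIELDS = {"record_id", "amount", "event_time"}

def validate(record: dict) -> list[str]:
    """Return a list of data-quality violations for one record."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        errors.append("amount is not numeric")
    if "event_time" in record:
        try:
            datetime.fromisoformat(record["event_time"])
        except (TypeError, ValueError):
            errors.append("event_time is not ISO-8601")
    return errors

batch = [
    {"record_id": "r1", "amount": 12.5, "event_time": "2024-11-13T08:00:00"},
    {"record_id": "r2", "amount": "oops", "event_time": "not-a-date"},
]
good = [r for r in batch if not validate(r)]
bad = [(r["record_id"], validate(r)) for r in batch if validate(r)]
print(len(good), bad)
```

In a real platform these checks would typically live in the orchestration layer (e.g., as Airflow task logic or dbt tests), with failing records quarantined and surfaced to data owners rather than silently dropped.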

Additional Information

Work Personas

We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work. Learn more here.

Equal Opportunity Employer

ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law. In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements. 

Accommodations

We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact

gl************@se********.com for assistance.

Export Control Regulations

For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities. 

From Fortune. ©2024 Fortune Media IP Limited. All rights reserved. Used under license. 



Source link

13Nov

Software Engineer III – Full Stack Developer (Python & React) at JPMorgan Chase & Co. – Bengaluru, Karnataka, India


We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

As a Software Engineer III at JPMorgan Chase within the Asset and Wealth Management Technology division, you will be given the chance to improve, design, and implement the software elements of our cutting-edge technology products in a secure, stable, and scalable manner. As a rising member of a software engineering team, you will be responsible for executing software solutions by designing, developing, and troubleshooting various components within a technical product, application, or system. This role will provide you with the necessary skills and experience for growth. A key aspect of this position involves developing the application stack to facilitate analytics/AI solutions, which will be instrumental in shaping the experiences of our clients and employees worldwide.

Job Responsibilities

  • Design, deploy, and manage software solutions, including development and technical troubleshooting for various solutions in the financial services domain
  • Write secure and high-quality code with limited guidance
  • Design, develop, code, and troubleshoot with consideration of upstream and downstream systems and technical implications
  • Apply knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
  • Collaborate with cross-functional teams to identify requirements and develop solutions to meet business needs within the organization
  • Communicate effectively with both technical and non-technical stakeholders, demonstrating excellent communication skills

 

Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 3+ years of applied experience
  • Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
  • Programming skills in Python, with experience in JavaScript technologies such as Node.js and React
  • Exposure to and understanding of agile practices, including CI/CD, application resiliency, and security
  • Knowledge of deployment processes, including experience with Git and version control systems for efficient collaboration and code management
  • Familiarity with data structures and algorithms, enabling effective problem-solving and optimization
  • Understanding of the software development lifecycle, with a focus on incorporating analytics/AI components and adhering to best practices in version control and code quality
  • Demonstrated knowledge of software applications and technical processes within a cloud deployment
  • Excellent problem-solving skills and the ability to communicate ideas and results to stakeholders and leadership in a clear and concise manner

Preferred qualifications, capabilities, and skills

  • Master’s or PhD in Computer Science, Data Science or related field
  • Experience in developing and deploying production-grade analytics solutions in the financial services industry
  • Knowledge of financial products and services including trading, investment and risk management
  • Experience in data pre-processing, feature engineering, and data analysis would be beneficial
  • Experience in developing APIs and integrating AI models into software applications
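
The last preferred qualification, developing APIs that integrate AI models into applications, can be sketched with Python's standard library alone. This is a toy illustration (the `/score` endpoint and the `score` function standing in for a real model's `predict()` are hypothetical; a production service would use a proper framework and a trained model):

```python
import json
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread

def score(features: list) -> float:
    """Stand-in for a real model's predict(); returns a toy weighted sum."""
    return round(sum(0.5 * x for x in features), 4)

class ScoreHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Parse the JSON request body and return the model's score as JSON.
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        payload = json.dumps({"score": score(body["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep demo output quiet
        pass

# Serve on an OS-assigned free port in a background thread, then call it.
server = HTTPServer(("127.0.0.1", 0), ScoreHandler)
Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/score",
    data=json.dumps({"features": [1.0, 2.0, 3.0]}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)
print(result)
server.shutdown()
```

The same request/response shape carries over when the stub is replaced by a loaded model and the handler by a framework route; the API contract (JSON features in, JSON score out) is the part that matters.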

 



Source link
