Client story

TE Connectivity boosts product development with a knowledge hub

Client: TE Connectivity
Region: USA
Industry: High-tech

Gen AI-powered research, developed in partnership with 乌鸦传媒 and AWS, gives engineers access to previously siloed data

Client challenge: Product development teams needed to sift through countless documents scattered across dozens of incompatible systems to conduct background research.
Solution: 乌鸦传媒 and AWS worked with TE Connectivity to create a Gen AI-powered platform that consolidates all internal research within an intuitive UI.
Benefits:

  • Productivity increased 5 to 10 times for product development
  • 2.5 million documents ingested in three months
  • Access granted to 8,000 engineers at launch

TE Connectivity (TE), formerly known as Tyco Electronics and now a global leader in connectivity and sensing solutions, has distinguished itself from the competition with the kind of cutting-edge industrial technology that makes our modern world possible. A broad range of industries, including automotive, aerospace, energy, consumer electronics, and healthcare, rely on TE’s engineering expertise and innovations to transform their operations. Every year, the company produces more than 235 billion parts in 140 factories around the world.

Closely tracking recent breakthroughs in generative AI, TE was enthusiastic about how the technology could help search and summarize internal documents. These documents contain important proprietary information, and the company needed them to be readily available to staff.

During the request-for-proposal process, 乌鸦传媒 and Amazon Web Services (AWS) proposed building a solution that would harmonize TE’s diverse datasets, establish a central repository, and build an intuitive chat function in a clean user interface (UI).

“Based on that flexibility, that background, that proven experience, we felt 乌鸦传媒 and AWS were right for us. We certainly haven’t been disappointed. That was the right choice,” said Phil Gilchrist, Chief Transformation Officer at TE.

The development process: Building an LLM with proprietary data

With 75 million engineering documents spread across 66 different databases, it was difficult for TE’s research and development (R&D) teams to find the right information. The scattered nature of the information also meant subject matter experts often had to answer questions about specific projects when researchers could not locate relevant reference documents.

There was a tight deadline for producing a solution. In just over three months, the team ingested a wealth of marketing and operational data and 2.5 million engineering documents, many of which needed to be scrubbed, and surfaced them through a modern UI. AWS supplied the cloud infrastructure and fully managed services: Amazon Bedrock, which enables integration of high-performing Gen AI foundation models, and Amazon OpenSearch Service, which makes it easy to deploy and operate search, analytics, and visualization capabilities.

Meanwhile, 乌鸦传媒 used retrieval-augmented generation (RAG), an architectural approach that grounds large language model (LLM) responses in documents retrieved from an external knowledge store, to integrate these services into an enterprise-scale solution.
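
For readers who want to see how these pieces typically fit together, below is a minimal RAG sketch in Python on the two AWS services named above. It is an illustration, not TE’s production code: the OpenSearch endpoint, index name, field names, and embedding model (Amazon Titan) are assumptions, and a Claude 3.5 Sonnet model ID on Amazon Bedrock stands in for whatever model configuration TELme actually uses.

```python
# Illustrative RAG sketch on Amazon Bedrock + Amazon OpenSearch Service.
# All names (endpoint, index, fields, model IDs) are assumptions, not TE's setup.
import json

import boto3
from opensearchpy import OpenSearch

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
search = OpenSearch(
    hosts=[{"host": "my-opensearch-domain.example.com", "port": 443}],  # hypothetical endpoint
    use_ssl=True,  # authentication omitted for brevity
)

def embed(text: str) -> list[float]:
    """Turn text into a vector with a Bedrock-hosted embedding model (Titan assumed)."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

def ingest(doc_id: str, title: str, body: str) -> None:
    """Index one cleaned document, with its embedding, into a k-NN enabled index."""
    search.index(
        index="engineering-docs",  # hypothetical index, assumed to have a knn_vector mapping
        id=doc_id,
        body={"title": title, "body": body, "embedding": embed(body)},
    )

def answer(question: str) -> str:
    """Retrieve the most relevant passages, then let the LLM answer from them."""
    hits = search.search(
        index="engineering-docs",
        body={
            "size": 4,
            "query": {"knn": {"embedding": {"vector": embed(question), "k": 4}}},
        },
    )["hits"]["hits"]
    context = "\n\n".join(hit["_source"]["body"] for hit in hits)

    prompt = (
        "Answer the engineer's question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 512,
            "messages": [{"role": "user", "content": prompt}],
        }),
    )
    return json.loads(response["body"].read())["content"][0]["text"]
```

Because the generation step is grounded in retrieved passages, this kind of assistant can answer from proprietary documents without the underlying model being retrained.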

“What they were able to do was launch a safe, secure solution into our security framework and our document structure almost immediately that’s very scalable, very secure, and something that so far has had a high quality of operation,” Gilchrist said.

A cross-company team continues to improve the tool on an almost daily basis with the ingestion of additional content (including all 75 million engineering documents by April 2025) and tweaks based on user feedback.

“Honestly, the team worked flawlessly together to stay on it,” Gilchrist said. “Everyone really wanted to make it work and we did make it work. So, I would say based on that common objective and a tight timeline, we had no choice but to work very closely together, and that was a fantastic experience.”

The transformative solution: TELme

The result was TELme, a conversational platform powered by Gen AI that collects and organizes the company’s diverse pool of internal knowledge about various industries and products in a single place.

TELme is TE’s Gen AI implementation, built on Claude 3.5, the AI assistant created by Anthropic, and grounded in the company’s proprietary data. It is all organized under a single application programming interface.
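
As a rough illustration of what a single programming interface across Bedrock-hosted models can look like, the snippet below uses the Amazon Bedrock Converse API to call a Claude 3.5 model. The model ID, region, and prompt are assumptions for the example, not details taken from TELme.

```python
# Illustrative only: one calling convention (the Bedrock Converse API) works across
# foundation models; here it is pointed at a Claude 3.5 Sonnet model ID.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize our qualification test history for connector family X."}],
    }],
    inferenceConfig={"maxTokens": 512},
)
print(response["output"]["message"]["content"][0]["text"])
```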

“Finding the right document was like finding a needle not just in one haystack but in 66 haystacks,” Gilchrist said. “TELme allows us to remove the haystacks and just find the needles.”

TELme establishes continuity and allows the company to hand down knowledge from one generation to the next. “We believe TELme will come to represent the sum of intellectual knowledge of the company in one form or another. But not only that: it’s a knowledge base that can be put into action. What that LLM will enable them to do is find the right piece of information right up front within seconds, rather than within a morning of trawling through extraneous documents.”

TELme goes beyond knowledge management. It provides an enterprise-wide research environment that fosters collaboration and communication.
