PoC Articles - Enterprise Knowledge

LLM Solutions PoC to Production: From RAGs to Riches (Part 1)

In the past year, many of the organizations EK has partnered with have been developing Large Language Model (LLM) based Proof-of-Concepts (PoCs). These projects are often pushed for by an enthusiastic IT team or internal initiative, with the low barrier to entry and low cost of LLM development making them easy for executives to greenlight. Despite initial optimism, these LLM PoCs rarely reach the enterprise-grade implementations promised, due to factors such as a lack of organizational buy-in, technical complexity, security concerns, misalignment on content readiness for AI solutions, and underinvestment in key infrastructure. Gartner, for example, has predicted that 30% of generative AI projects will be abandoned after PoC by the end of 2025. This blog provides an overview of EK’s approach to evaluating and roadmapping an LLM solution from PoC to production, and highlights several dimensions that are important to successfully scaling an LLM-based enterprise solution.


Organizational Implementation Considerations:

Before starting the technical journey from “RAGs to Riches,” an organization should weigh several considerations that apply before, during, and after building a production solution. Taking each of these into account gives a production LLM solution a much higher chance of success.

Before: Aligning Business Outcomes

Prior to building out a production LLM solution, a team will typically have developed a PoC LLM solution that answers a limited set of use cases. Before production development begins, it is imperative that business outcomes and the priorities of key stakeholders are aligned with project goals. This often looks like mapping business outcomes – such as enhanced customer interactions, operational efficiency, or reduced compliance risk – to quantifiable measures such as shorter response times and improved findability of information. It is important to ensure these business goals carry through from development to production and customer adoption. Beyond meeting technical functionality, setting clear customer and organizational goals helps ensure the production LLM solution retains organizational support throughout its entire lifecycle.

During: Training Talent and Proving Solutions

Building out a production LLM solution will require a team with specialized skills in natural language processing (NLP), prompt engineering, semantic integration, and embedding strategies. In addition, EK recommends investing in content strategists and SMEs who understand the state of their organization’s data and/or content. These roles are critical to preparing content for AI solutions, ensuring the LLM solution has comprehensive and semantically meaningful content to draw on. Organizations EK has worked with have successfully launched and maintained production LLM solutions by proactively investing in these skills for their staff, building resilience into the overall solution and driving success in LLM solution development.

After: Infrastructure Planning and Roadmapping

To maintain a production LLM solution after it has been deployed to end users, organizations must account for the infrastructure investments and operational costs required, as well as ongoing content and data maintenance. These resources might include enterprise licensing, additional software infrastructure, and ongoing support costs. While many of these costs can be mitigated by effectively aligning business outcomes and training organizational talent, organizations still need a roadmap and investment plan for the future infrastructure – both systems and content – of the production LLM solution.


Technical Criteria for Evaluating LLM PoCs:

In parallel with these organizational implementation considerations – and drawing on EK’s depth of experience developing LLM MVPs, designing enterprise AI architecture, and implementing more advanced LLM solutions such as Semantic RAG – EK has developed 7 key dimensions for evaluating the effectiveness of an LLM PoC:

Figure 1: Dimensions for Evaluating an LLM Solution

1. Depth of Interaction: refers to how deeply and dynamically users can engage with the LLM solution. At a lower level, interaction might simply involve asking questions and receiving direct answers, while at the highest level, intelligent agents act autonomously on behalf of the user, leveraging multiple tools and executing tasks.

2. Freshness of Information: describes how frequently the content and data behind the LLM solution are updated and how quickly users receive those updates. Lower freshness implies data is updated infrequently; at higher freshness levels, data is updated frequently or even continuously, helping ensure users are always interacting with the most current and accurate information available.

3. Level of Explanation: refers to how transparently the LLM solution communicates the rationale behind its responses. At a lower level of explanation, users simply receive answers without clear reasoning. In contrast, a high level of explanation includes evidence, citations, audit trails, and a clear path showing how information was retrieved.

4. Personalization, Access & Entitlements Requirements: describes how specifically content and data are tailored and made accessible based on user identity, roles, behavior, or needs. At lower levels, content is available to all users without personalization or adaptations. At higher levels, content personalization is integrated with user profiles, entitlements, and explicit access controls, ensuring users only see highly relevant, permissioned content. 

5. Accuracy of Information: refers to how reliably and correctly the LLM solution can answer user queries. At lower levels, users receive reasonable answers that may have minor ambiguities or occasional inaccuracies. At the highest accuracy level, each response is traced back to original source materials and cross-validated with authoritative sources.

6. Enterprise Agentic Support: describes how the LLM solution interacts with the broader enterprise AI ecosystem, and coordinates with other AI agents. At the lowest level, the solution acts independently without any coordination with external AI agents. At the highest level, the solution seamlessly integrates as a consumer and provider within an ecosystem of other intelligent agents.

7. Enterprise Embedding Strategy: refers to how the LLM solution converts information into vector representations (embeddings) to support retrieval. At a lower level, embeddings are simple vector representations with minimal or no structured metadata. At the highest levels, embeddings include robust metadata and are integrated with enterprise context through semantic interpretation and ontology-based linkages.
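To make the highest level of this last dimension concrete, the sketch below pairs a vector with structured metadata at indexing time. It is a minimal illustration only: `embed()` is a toy stand-in for whatever embedding model an organization actually uses, and the metadata field names (source, updated, entitlements, concepts) are assumptions, not a prescribed schema.

```python
import hashlib

def embed(text: str, dim: int = 8) -> list[float]:
    """Toy stand-in for a real embedding model -- deterministic, not semantic."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255 for b in digest[:dim]]

def make_embedding_record(doc_id: str, text: str, metadata: dict) -> dict:
    """Attach structured metadata to the vector so retrieval can filter on
    entitlements, freshness, and ontology-linked concepts, not just similarity."""
    return {
        "id": doc_id,
        "vector": embed(text),
        "metadata": {
            "source": metadata.get("source"),              # provenance
            "updated": metadata.get("updated"),            # supports freshness checks
            "entitlements": metadata.get("entitlements"),  # supports access control
            "concepts": metadata.get("concepts"),          # taxonomy/ontology tags
        },
    }

record = make_embedding_record(
    "onboarding-001",
    "How to request lab access during onboarding.",
    {"source": "wiki", "updated": "2025-06-01",
     "entitlements": ["employee"], "concepts": ["onboarding", "lab-access"]},
)
print(record["metadata"])
```

At the higher maturity levels described above, the `concepts` field would be populated from a governed ontology rather than free-form tags.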

For an organization, each of these technical criteria will be weighted differently based on the unique use cases and requirements of the LLM solution. For example, an organization working on a content generation use case might place greater emphasis on Level of Explanation and Freshness of Information, while an organization working on an information retrieval use case may care more about Personalization, Access & Entitlements Requirements. Agreeing on the level of proficiency needed within each factor is an integral part of the evaluation process. Leveraging this standard, EK has worked with organizations across various industries and diverse LLM use cases to optimize their solutions.
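As a rough illustration of how that weighting might be operationalized, the sketch below scores a hypothetical PoC against per-dimension targets. The 1–5 maturity scores, targets, and weights are all invented for the example; this is not EK’s scoring formula.

```python
DIMENSIONS = [
    "depth_of_interaction", "freshness", "explanation", "personalization",
    "accuracy", "agentic_support", "embedding_strategy",
]

def weighted_gap(scores: dict, targets: dict, weights: dict) -> float:
    """Weighted average shortfall between current PoC scores and target state."""
    total = sum(weights.values())
    return sum(weights[d] * max(targets[d] - scores[d], 0) for d in DIMENSIONS) / total

scores = dict.fromkeys(DIMENSIONS, 2)   # current PoC maturity, 1-5 scale
# An information-retrieval use case that prioritizes accuracy and freshness:
targets = {**dict.fromkeys(DIMENSIONS, 3), "accuracy": 5, "freshness": 4}
weights = {**dict.fromkeys(DIMENSIONS, 1), "accuracy": 3, "freshness": 2}

print(f"Weighted gap to target: {weighted_gap(scores, targets, weights):.2f}")
```

A large gap on a heavily weighted dimension signals where roadmap investment should land first.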

Additionally, EK recommends that an organization undergoing an LLM PoC evaluation also conduct an in-depth analysis of content relevant to their selected use case(s). This provides a more comprehensive understanding of content quality – including factors like completeness, relevancy, and currency – and can help unearth gaps in what the LLM may be able to answer. All of this informs the testing phase by guiding the creation of each test and its expected outcomes, and can generally be categorized across three main areas of remediation (a lightweight audit sketch follows the list):

  • Content Quality – Content on a given topic may not explicitly exist or may not be standardized; this can necessitate creating dummy data to enable certain types of tests.
  • Content Structure – The way certain content is structured varies; we can likely posit that one particular structure will give more accurate results than another. This may necessitate creating headings to indicate clear hierarchy on pages and templates to consistently structure content.
  • Content Metadata – Contextual information that may be useful to users is missing from content. This may necessitate establishing a taxonomy to tag content with a controlled vocabulary, or an ontology to establish relationships between concepts.
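As referenced above, such an audit can start as simple heuristics. The sketch below flags remediation candidates in each of the three areas; the field names (`body`, `tags`) and the markdown-style heading check are assumptions to adapt to however your CMS exposes content.

```python
import re

def audit_content(doc: dict) -> list[str]:
    """Flag remediation candidates across quality, structure, and metadata."""
    findings = []
    # Content Quality: very thin pages can't ground an accurate answer.
    if len(doc.get("body", "").split()) < 50:
        findings.append("quality: body under 50 words")
    # Content Structure: no headings means no hierarchy to chunk against.
    if not re.search(r"^#{1,3} ", doc.get("body", ""), flags=re.M):
        findings.append("structure: no headings found")
    # Content Metadata: untagged content can't be filtered or related.
    if not doc.get("tags"):
        findings.append("metadata: no taxonomy tags")
    return findings

print(audit_content({"body": "# Setup\n" + "word " * 60, "tags": []}))
# -> ['metadata: no taxonomy tags']
```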


Technical Evaluation of LLM PoCs In Practice:

Putting the organizational implementation and technical considerations into practice, EK recently completed an engagement with a leading semiconductor manufacturer, employing this standard process to evaluate their PoC LLM search solution. The organization had developed a PoC search solution used to answer questions against a series of user-selected PDFs relating to the company’s technical onboarding documentation. EK worked with the organization to align on key functional requirements via a capability assessment for a production LLM solution based on the 7 dimensions EK has identified. Additionally, EK completed a simultaneous analysis of in-scope content for the use case. The results of this content evaluation informed which content components should be prioritized as candidates for the testing plan.

After aligning on priority requirements – in this case, accuracy and freshness of information – EK developed and conducted a testing plan for parts of the PoC LLM. To operationalize it, EK created a four-phase RAG Evaluation & Optimization Workflow that turns test results into actionable insights. This workflow helped produce a present-state snapshot of the LLM solution, a target-state benchmark, and a bridging roadmap that prioritizes retriever tuning, prompt adjustments, and content enrichment. Based on the workflow results, stakeholders at the organization were able to easily interpret how improved semantics, content quality, structure, and metadata would improve the results of their LLM search solution.
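The four-phase workflow itself is beyond the scope of this post, but the kind of test it operationalizes can be sketched simply. Below, `retrieve` and `answer` are stand-ins for a pipeline’s own components, and the test-case fields are illustrative; a real plan would add graded relevance, citation checks, and more robust answer scoring.

```python
def evaluate_rag(test_cases, retrieve, answer):
    """Produce a present-state snapshot: retrieval hit rate and keyword accuracy."""
    hits = correct = 0
    for case in test_cases:
        docs = retrieve(case["question"])
        if case["expected_doc"] in [d["id"] for d in docs]:
            hits += 1
        response = answer(case["question"], docs)
        if all(k.lower() in response.lower() for k in case["expected_keywords"]):
            correct += 1
    n = len(test_cases)
    return {"retrieval_hit_rate": hits / n, "answer_accuracy": correct / n}

# Stub pipeline components for illustration only.
CORPUS = {"doc-1": "Badge requests are filed through the facilities portal."}
retrieve = lambda q: [{"id": k, "text": v} for k, v in CORPUS.items()]
answer = lambda q, docs: docs[0]["text"] if docs else ""

tests = [{"question": "How do I get a badge?",
          "expected_doc": "doc-1",
          "expected_keywords": ["facilities portal"]}]
print(evaluate_rag(tests, retrieve, answer))
# -> {'retrieval_hit_rate': 1.0, 'answer_accuracy': 1.0}
```

Running the same test set before and after retriever tuning or content enrichment turns those interventions into measurable deltas.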

In the following blogs of the “RAGs to Riches” series, EK will explain the process for developing a capability assessment and testing plan for LLM-based PoCs. These blogs will expand on how each of the technical criteria can be measured, as well as how to develop a long-term strategy for production solutions.


Conclusion

Moving an LLM solution from proof-of-concept to enterprise production is no small feat. It requires careful attention to organizational alignment, strong business cases, technical planning, compliance readiness, content optimization, and a commitment to ongoing talent development. Addressing these dimensions systematically will ensure your organization is well positioned to turn AI innovation into a durable competitive advantage.

If you are interested in having EK evaluate your LLM-based solution and help build out an enterprise-grade implementation, contact us here.

Graph Solutions PoC to Production: Overcoming the Barriers to Success (Part I)

Part I: A Review of Why Graph PoCs Struggle to Demonstrate Success or Progress to Production

This is Part 1 of a two-part series on graph database PoC success and production deployment.


Introduction

I began my journey with graphs around 2014 when I discovered network theory and tools like NetworkX and Neo4j. As our world becomes increasingly connected, it makes sense to work with data by leveraging its inherent connections. Soon, every problem I faced seemed ideally suited for graph solutions.

Early in that journey, I worked with a biotech startup, exploring how graphs could surface insights into drug-protein interactions (DPI). The team was excited about graphs’ potential to reveal latent signals that traditional analytics missed. With a small budget, we created a Proof-of-Concept (PoC) to demonstrate the “art of the possible.” After a quick kick-off meeting, we loaded data into a free graph database and wrote queries exploring the DPI network. In just three months, we surfaced novel insights that advanced the team’s understanding.

Despite what we considered success, the engagement wasn’t extended. More troubling, I later learned our PoC had been put into a production-like environment where it failed to scale in performance or handle new data sources. What went wrong? How had we lost the potential scientific value of what we’d built?

This experience highlights a common problem in the graph domain: many promising PoCs never make it to production. Through reflection, I’ve developed strategies for avoiding these issues and increasing the likelihood of successful transitions to production. This blog explores why graph PoCs fail and presents a holistic approach for success. It complements the blog Why Graph Implementations Fail (Early Signs & Successes).

Why Graph Database Solutions and Knowledge Graph PoCs Often Fail

Organizational Challenges

Lack of Executive Sponsorship and Alignment

Successful production deployments require strong top-level support. Without executive buy-in, graph initiatives seldom become priorities or receive funding. Executives often don’t understand the limitations of existing approaches or the paradigm shift that graphs represent.

The lack of sponsorship is compounded by how graph practitioners approach stakeholders. We often begin with technical explanations of graph theory, ontologies, and the differences between Resource Description Framework (RDF) and Label Property Graphs (LPG), rather than focusing on business value. No wonder executives struggle to understand why graph initiatives deserve funding over other projects. I’ve been guilty of this myself, starting conversations with “Let me tell you about Leonhard Euler and graph theory…” instead of addressing business problems directly.

Middle Management Resistance and Data Silos

Even with executive interest, mid-level managers can inhibit progress. Many have vested interests in existing systems and fear losing control over their data domains. They’re comfortable with familiar relational databases and may view knowledge graphs as threats to their “systems of record.” This presents an opportunity to engage managers and demonstrate how graphs can integrate with existing systems and support their goals. For example, a graph database may load data “just in time” to perform a connected data analysis and then drop the data after returning the analytic results – an ephemeral use of graph analytics.
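A minimal sketch of that ephemeral pattern, using networkx as the in-memory engine; the edge rows stand in for the result of a query against an existing system of record:

```python
import networkx as nx

def ephemeral_centrality(edge_rows, top_n=5):
    """Build a graph just in time, analyze it, and let it go -- no second
    system of record is created and nothing is persisted."""
    g = nx.Graph()
    g.add_edges_from(edge_rows)
    scores = nx.betweenness_centrality(g)
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

# Rows as they might come back from a warehouse query (illustrative).
rows = [("order-svc", "payments"), ("payments", "ledger"),
        ("order-svc", "inventory"), ("inventory", "ledger")]
print(ephemeral_centrality(rows))  # the graph is garbage-collected after the call
```

Because the data never leaves the governed source for long, this framing often defuses the “another copy of my data” objection.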

Bureaucracy and Data Duplication Concerns

Large organizations have lengthy approval processes for new technologies. Infrastructure teams may be reluctant to support experimental technology without an established return on investment (ROI).

A critical but often undiscussed factor is that graph databases typically require extracting data from existing sources and creating another copy – raising security risks, infrastructure costs, and data synchronization concerns. This is the Achilles heel of graph databases. However, emerging trends in decoupling data from query engines may offer an alternative: a new paradigm in which data in data lakes can be analyzed through a graph lens at rest, without ETL ingestion into a graph database. Graph query engines enable the same data to be viewed through traditional relational and, now, connected-data lenses.

Isolated Use Cases and Limited Understanding

Many graph initiatives start as isolated projects tackling narrow use cases. While this limits upfront risk, it can make the impact seem trivial. Conventional technologies might solve that single problem adequately, leading skeptics to question whether a new approach is needed. The real value of knowledge graphs emerges when connecting data across silos—something that’s hard to demonstrate in limited-scope PoCs.

A practical approach I’ve found effective is asking stakeholders to diagram their problem on a whiteboard. This naturally reveals how they’re already thinking in graph terms, making it easier to demonstrate the value of a graph approach.

Talent and Skills Gap

Graph technologies require specialized skills that are in short supply. Learning curve issues affect even experienced developers, who must master new query languages and paradigms. This shortage of expertise can lead to reliance on a few key individuals, putting projects at risk if they leave.


Technical Challenges

Complex Data Modeling

Graph data models require a different mindset than relational schemas. Designing an effective graph schema or ontology is complex, and mistakes can lead to poor performance. Equally, an effective semantic layer is critical to understanding the meaning of an organization’s data. The schema-less flexibility of graphs can be a double-edged sword—without careful planning, a PoC might be built ad-hoc and prove inefficient or lead to data quality issues when scaled up. Refactoring a graph model late in development can be a major undertaking that casts doubt on the technology itself.
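Even a few lines of up-front modeling force the design decisions that ad-hoc PoCs skip. Below is a minimal sketch using rdflib and an RDFS vocabulary, with a hypothetical namespace and the drug-protein domain from the introduction; a production ontology would go much further (OWL constraints, SHACL shapes, governance).

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("https://example.org/onto/")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)
# Declare the classes and one typed, documented relationship up front,
# rather than letting node labels and edge types accrete ad hoc.
g.add((EX.Drug, RDF.type, RDFS.Class))
g.add((EX.Protein, RDF.type, RDFS.Class))
g.add((EX.interactsWith, RDF.type, RDF.Property))
g.add((EX.interactsWith, RDFS.domain, EX.Drug))
g.add((EX.interactsWith, RDFS.range, EX.Protein))
g.add((EX.interactsWith, RDFS.label, Literal("interacts with")))

print(g.serialize(format="turtle"))
```

Declaring domain and range up front is exactly the kind of decision that is cheap now and a major refactoring effort later.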

Integration Challenges

Enterprise data rarely lives in one place. Integrating graphs and other solutions with legacy systems requires extensive data mapping and transformation. Without smooth interoperability via connectors, APIs, or virtualization layers, the graph becomes an isolated silo with limited production value. The decoupled approaches mentioned above address this challenge by offering graph and connected-data analytics as a standalone feature of graph query engines, and tooling optimized for graphs is making ETL and integration of graph databases easier and more efficient.

Performance Trade-offs

Graph databases excel at traversing complex relationships but may underperform for simple transactions compared to optimized relational databases. In a PoC with a small dataset, this may not be immediately noticeable, but production workloads expose these limitations. As data volumes grow, even traversals that were fast in the PoC can slow significantly, requiring careful performance tuning and possibly hybrid approaches.

Evolving Standards and Tooling

The graph ecosystem is still evolving, with multiple database models and query languages (Cypher, Gremlin, SPARQL). More recently, decoupled graph query engines enable analyzing tabular and columnar data as if it were a graph, supporting the concept of “Single Copy Analytics” and concurrently increasing the breadth of options for graph analytics. Unlike the relational world with SQL and decades of tooling, graph technologies lack standardization, making it difficult to find mature tools for monitoring, validation, and analytics integration. This inconsistency means organizations must develop more in-house expertise and custom tools. 

Production Readiness Gaps

Production deployment requires high availability, backups, and disaster recovery—considerations often overlooked during PoCs. Some graph databases lack battle-tested replication, clustering, and monitoring solutions. Integrating with enterprise logging and DevOps pipelines requires additional effort that can derail production transitions. In the next blog on this topic, we will present strategies for integrating logging into a PoC and production releases.

Scaling Limitations

Graph databases often struggle with horizontal scaling compared to relational databases. While this isn’t apparent in small PoCs, production deployment across multiple servers can reveal significant challenges. As graphs grow larger and more complex, query performance can degrade dramatically without careful tuning and indexing strategies. We will explore how to thoughtfully scale graph efforts in the next blog on taking projects from PoC to Production.


Security and Compliance Challenges

Access Control Complexity

Graphs connect data in ways that complicate fine-grained access control. In a relational system, you might restrict access to certain tables; in a graph, queries traverse multiple node types and relationships. Implementing security after the fact is tremendously complex. Demonstrating that a graph solution can respect existing entitlements and implement role-based access control is crucial. 
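One way to make that demonstration concrete is to resolve entitlements before traversal rather than after, so a query can never hop through a restricted node. The sketch below uses networkx and assumes a node-level `acl` attribute; real deployments would push this into the database’s own security layer.

```python
import networkx as nx

def entitled_subgraph(g: nx.Graph, user_roles: set) -> nx.Graph:
    """Restrict the graph to nodes whose ACL intersects the user's roles,
    *before* any traversal runs."""
    visible = [n for n, data in g.nodes(data=True)
               if data.get("acl", set()) & user_roles]
    return g.subgraph(visible)

g = nx.Graph()
g.add_node("customer-42", acl={"sales", "support"})
g.add_node("salary-table", acl={"hr"})
g.add_node("ticket-7", acl={"support"})
g.add_edge("customer-42", "salary-table")
g.add_edge("customer-42", "ticket-7")

view = entitled_subgraph(g, {"support"})
print(list(view.edges()))  # salary-table and its relationships are invisible
```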

Sensitive Data and Privacy Risks

Graphs can amplify privacy concerns because of their connected nature. An unauthorized user gaining partial access might infer much more from relationship patterns. This interconnectedness raises security stakes—you must protect not just individual data points but relationships as well.

Regulatory Compliance

Regulations like GDPR, HIPAA, or PCI present unique challenges for graphs. For instance, GDPR’s “right to be forgotten” is difficult to implement when deleting a node might leave residual links or inferred knowledge. Auditing requires tracking which relationships were traversed, and demonstrating data lineage becomes complex. If compliance wasn’t planned for in the PoC, retrofitting it can stall production deployment.
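The erasure problem is easy to see in miniature. In the sketch below, deleting a node drops its direct edges, but an edge inferred from that node survives unless provenance is tracked and swept; the `derived_from` attribute is an assumed convention for that provenance, not a networkx feature.

```python
import networkx as nx

def forget(g: nx.Graph, subject: str) -> None:
    """Erase a node, its incident edges, and any derived edges whose
    provenance traces back to it."""
    g.remove_node(subject)  # removes the node and its direct edges
    stale = [(u, v) for u, v, d in g.edges(data=True)
             if subject in d.get("derived_from", ())]
    g.remove_edges_from(stale)

g = nx.Graph()
g.add_edge("alice", "acme-corp")
g.add_edge("bob", "acme-corp", derived_from=("alice",))  # inferred via alice
forget(g, "alice")
print(list(g.edges()))  # -> []; the inferred edge is swept as well
```

Without that provenance sweep, the “forgotten” subject lives on in inferred relationships – precisely the residual knowledge regulators care about.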


Financial and ROI Challenges

Unclear Business Value

Justifying a graph project financially is tricky, especially when benefits are long-term or indirect. A PoC might show an interesting capability, but translating that into clear ROI is difficult if only one use case is demonstrated. Without a strong business case tied to measurable Key Performance Indicators (KPIs), projects struggle to secure production funding.

Scaling Costs

PoCs often leverage free or low-cost resources. However, production deployment requires enterprise licenses, robust infrastructure, and high-availability configurations. An enterprise-level knowledge graph spanning multiple use cases can incur significant long-term costs. These financial requirements can shock organizations that didn’t plan for them.

Operational and Talent Expenses

Beyond technology costs, successfully operating a knowledge graph requires specialized talent—data engineers, knowledge engineers, and graph database administrators. While a PoC might be built by a single person or small team, maintaining a production graph could require several dedicated staff. This represents a significant ongoing expense that organizations often underestimate.

Competing Priorities

Every project competes for finite resources. Graph initiatives promise strategic long-term benefits but may seem less immediately impactful than customer-facing applications. Organizations focused on quarterly results may abandon graph projects if they don’t show quick wins. Breaking the roadmap into phased deliverables demonstrating incremental value can help maintain support.


Data Governance and Scalability Challenges

Ontology and Data Stewardship

Knowledge graphs require consistent definitions across the enterprise. Many organizations lack ontology expertise, leading to inconsistent data modeling. Strong governance is essential to manage how data elements are defined, connected, and updated. Without data stewards responsible for accuracy, production graphs can become unreliable or inconsistent, undermining user trust.


Conclusion

Transitioning a graph database or knowledge graph from PoC to production involves multiple challenges across organizational, technical, security, financial, governance, and talent dimensions. Many promising PoCs fail to cross this “last mile” due to one or more of these issues.

In Part Two, I’ll outline a holistic strategy for successful graph initiatives that can effectively transition to production—incorporating executive alignment, technical best practices, emerging trends like GraphRAG and semantic layers, and the critical people-process factors that make the difference between a stalled pilot and a thriving production deployment.

Learning Strategy Assessment & POC

Our Learning Strategy Proof of Concept (POC) aligns your learning strategy with business goals and efficiently validates improvements with a clickable POC. Prove the value of your updated learning strategy, leverage observed outcomes, and take actionable steps to transform your learning ecosystem for maximum business impact.

Why Learning Strategy Assessment & POC

We ensure that your investment in learning and development yields substantial returns, empowering your workforce and driving business growth. We accomplish this by:

  • Conducting a Learning Strategy Current State Assessment: We identify business objectives for the learning program, define and prioritize learning and performance outcomes best aligned to those business objectives, and conduct a targeted review of relevant learning content.
  • Designing the POC: The design of the learning POC includes an evaluation plan, recommended methods for achieving the target state of learning, a definition of prioritized learning asset types, and a definition of the learning technologies necessary to accomplish the POC’s objectives.
  • Piloting and Evaluating the POC: We transform and develop collaborative learning content according to the design plan, then launch the POC to a pilot group to evaluate results.

Download the EK Learning Strategy Assessment

Outcomes

EK’s Learning Strategy Assessment & POC can help your organization:

  • Make informed decisions, measure progress, and continuously improve learning and performance outcomes.
  • Gain a competitive edge as your workforce becomes better equipped to meet challenges and drive success.
  • More efficiently evaluate and iterate, leading to enhanced organizational learning outcomes and a more substantial return on investment.

There are three available sizes of Learning Strategy Assessment & POC engagements, built to best fit your organization’s budget and needs, each of which can be viewed in more detail at the link. Are you looking to upgrade and modernize your learning strategy? Contact us here.

Let’s Talk Personalization: EK’s Joe Hilger to Speak at Upcoming Webinar “Opportunities And Outcomes From Personalizing Content”

Joe Hilger, COO of Enterprise Knowledge (EK), will join Kevin Nichols for the live webinar Opportunities And Outcomes From Personalizing Content. Hilger and Nichols will discuss what personalization means in the Knowledge Management (KM) space and how to apply KM in different business cases. They will also address best practices when implementing a personalization content strategy and explore how to harness the power of personalization to transform the way employees, partners, and customers interact with personalized content. By the conclusion of the webinar, listeners will be able to:

  • Scale and expand a Proof of Concept from one business use case;
  • Prepare their organization for personalization; and
  • Apply knowledge graphs to scale personalized content.


Leveraging EK’s experience implementing personalization tools and our PoC techniques, Hilger and Nichols will discuss some of the specific impacts personalization has had on Content Management Systems. Join them for Opportunities And Outcomes From Personalizing Content on March 15th, from 1:00 PM – 2:00 PM ET on BrightTalk.

EK’s Joe Hilger to Speak at Upcoming Webinar “Conducting A Personalization Proof-of-Concept (PoC)”

Joe Hilger, COO of Enterprise Knowledge (EK), will join Scott Abel for the live webinar “Conducting A Personalization Proof-of-Concept (PoC).” Hilger and Abel will discuss what personalization means in the Knowledge Management (KM) space and how to apply KM in different business use cases. Their webinar will also address best practices when first implementing a personalization content strategy and explore how to harness the power of personalization to transform the way employees, partners, and customers interact with your organization. By the conclusion of the webinar, listeners will be able to:

  • Prioritize use cases to narrow their PoC to a vertical slice;
  • Engineer content structure, metadata model, and schema for personalization;
  • Componentize content for reusability; and
  • Apply knowledge graphs to scale personalized content.


Leveraging EK’s experience implementing personalization tools and our PoC techniques, Hilger and Abel will discuss some of the specific impacts personalization has had on Content Management Systems. Join them for “Conducting A Personalization Proof-of-Concept” on January 24th, from 1:00 PM – 2:00 PM ET on BrightTalk.
