
How Libertex Group is continuously improving its developer experience using Port

Mar 19, 2024
Sooraj Shah

As part of a product mindset shift, Libertex Group has selected Port to overhaul its developer experience 

The success of Libertex Group lies in its technology capabilities. Its trading platform has many moving parts: microservices, APIs, pipelines and cloud resources, as well as tooling for AppSec and more.

Libertex took on the challenge of managing this complexity head-on and implemented a well-known enterprise architecture management (EAM) tool to create a large-scale enterprise architecture repository.

This would replace the static spreadsheets that developers had previously been using. But after three years, it was clear that this wasn’t enough.

The challenges of using an enterprise architecture management tool

The enterprise architecture management approach had significant shortcomings:

  •  It was rigid and opinionated: while Libertex had many types of assets, it was limited to the entity types the tool already contained.

“We couldn’t create new entities, so we couldn’t model the complexity of our organization. We had several thousand assets - applications, components, interfaces, data objects, which were in the registry but we were unable to add more,” said Libertex’s head of enterprise architecture, Alexander Bukarev.

  • It required manual work and didn’t update automatically, creating needless toil.

“We had to maintain all of this manually with only 10 architects,” said Bukarev.

  • Over time, the team began to suspect that the data quality inside the enterprise architecture system was not up to scratch: manual updates and opinionated entity categories consistently compromised the data, leaving it questionable and dated. As a result, the team couldn’t track the lifecycle of components, and the cognitive load on developers, SRE engineers and others kept growing.

The team decided a new approach was needed and set its sights on internal developer portals. The search for a new solution that could aid developers, architects and others coincided with a transformation within the company: changes in organizational structure, as well as a push to foster a product management approach. This product-centric lens would ensure that the team focused on users’ pain points when looking for a new solution and involved them during the scoping phase.

In addition, the team wanted to ensure the portal would be able to support improved data quality and sought a solution that could automate data collection, while implementing constraints and validations - something that wasn’t possible with its existing service registry.

“We wanted to have a portal that would be a single point of entry for everyone relevant,” said Ervin Varga, an architect at Libertex who is part of the enterprise architecture team.

“We wanted to address the needs of everyone by having a portal that extends in a way that addresses the pain points of the enterprise as we go on. Port gives us the flexibility of growth while also giving us the ability to implement the features we need right now very easily,” he added.

After evaluating the options on the market, it was clear that alternatives, including open source developer portals and value stream management tools, had one or more of the following constraints:

  • Too complicated to implement and/or maintain internally without assistance
  • Too expensive
  • Didn’t allow for an un-opinionated data model
  • Didn’t support plugins extensively (e.g. GitLab, AWS)
  • Didn’t allow for a ‘Continuous Improvement’ model (the company wanted to be sure that the data model could evolve over time as the platform and organization evolved).

It became clear that Port’s Internal Developer Portal didn’t have any of these constraints while offering additional features such as customizable dashboards, scorecards and developer self-service, making it the most reasonable choice.

Varga explained that Port’s community and customer service were other key reasons for its selection.

“Port has a tight connection with its community and our ability to influence Port's roadmap was something we always wished for with other vendors. The prompt responses from Port's engineers and open communication channels during and after the evaluation phase played a significant role in our decision to select Port,” he said.

The importance of an un-opinionated data model

Libertex’s previous EAM tool as well as the other developer portals on the market all had the same problem, according to Varga - they were all ‘opinionated’.

“They drive us or force us to use a specific model without any flexibility for changing or shaping them the way we wanted, which was a key limiting factor because we have a complex enterprise,” he said.

There were also issues with data duplication in the EAM’s service registry. For instance, an application and its components were supposed to keep the business owner in sync, but nothing prevented users from overriding it at the component level, leading to long-term inconsistencies, especially when data was updated manually.

“In Port, we can mirror this property while keeping the source of the truth in one specific part of the data model where it belongs at the system level. This means we can have a business owner and application architect for a component which is mirrored to a component that has a mirror property, so someone can immediately see who the application architect is for a specific component without duplicating data,” said Varga.
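In Port terms, this is a mirror property: the value is defined once on the System blueprint and surfaced read-only on every related Component. The sketch below is a minimal, hypothetical illustration, not Libertex’s actual model; it assumes a System blueprint that has a businessOwner property, which the Component blueprint then mirrors through its system relation:

{
  "identifier": "Component",
  "title": "Component",
  "schema": {
    "properties": {},
    "required": []
  },
  "relations": {
    "system": {
      "target": "System",
      "required": false,
      "many": false
    }
  },
  "mirrorProperties": {
    "businessOwner": {
      "title": "Business Owner",
      "path": "system.businessOwner"
    }
  }
}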

To illustrate the importance of the ability to model anything, Varga brings up FinOps. With the EAM tool, there were some elements that they couldn’t model at all within the service registry, such as mapping value streams for FinOps.

Bukarev and his team wanted to allocate infrastructure costs appropriately by connecting the company’s applications and assets with value streams. For instance, one value stream may be for operating in regulated countries, while another is for operating in a region that doesn’t have the same type of regulations. While the streams may look similar from an architecture perspective, the same system or application may be treated differently in each value stream.

“For one value stream, the system or application could be mission critical, and for another it could be business operational or lower.  So, for one value stream it’s expensive to run the application due to its heavy load, and at the same time, another value stream doesn’t need this application at all,” said Bukarev.

“Therefore, we need to connect our assets with value streams and classify relations with external attributes. But this wasn’t possible because we couldn’t extend our data model with simple value streams or product entities, meaning we couldn’t model our organization in the EAM system,” he added.

By using Port, Libertex has been able to accomplish this and more. Recently, it had three value streams running in parallel, based on the regions they operate in, which all looked similar. The team then implemented an additional value stream as an ‘internal product’, which is consumed by the first three.

“We’ve introduced dependencies between different value streams and we’ve created an ecosystem of value streams that we can model with no limitations in Port - something that wouldn’t have been possible with alternatives due to predefined models,” Bukarev said.
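As a rough sketch of what this kind of modeling can look like in Port, the hypothetical ValueStream blueprint below (the identifiers and the criticality property are illustrative, not Libertex’s actual schema) links systems to the value streams that consume them and lets a value stream depend on other value streams, which is how an ‘internal product’ stream can be consumed by region-specific streams. Attributes that vary per relation, such as criticality differing between streams, would need an intermediate blueprint, which this sketch leaves out for brevity:

{
  "identifier": "ValueStream",
  "title": "Value Stream",
  "schema": {
    "properties": {
      "criticality": {
        "type": "string",
        "enum": [
          "mission-critical",
          "business-operational"
        ]
      }
    },
    "required": []
  },
  "mirrorProperties": {},
  "calculationProperties": {},
  "relations": {
    "consumesStreams": {
      "target": "ValueStream",
      "required": false,
      "many": true
    },
    "systems": {
      "target": "System",
      "required": false,
      "many": true
    }
  }
}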

As Libertex wants to ensure it’s always ahead of its competitors, its business model evolves every two to four years, which has an impact on its organizational and engineering structure. This is why the enterprise architecture team wanted a system that didn’t impose limitations on making changes and extending its data model.

“With the previous tools, we weren’t able to introduce complete Blueprints like you can with Port which would have allowed us to extend the model so that we could incorporate other things, both that are specifically oriented towards developers, and things for other stakeholders such as understanding in what environment something is running, what the current version is of a particular running instance, and what the current situation was with the quality of those components,” said Varga. 

Libertex’s Core Focus Areas for Its Software Catalog

A Real-Time Software Catalog 

Maintaining a software catalog that reflects everything in real time is crucial for Libertex. As the volume of data grows and as changes become more frequent, it becomes impossible for staff to keep the data accurate.

“We need it automated for every ingestion and every update, so it’s mandatory to have a simple API but also some ready-to-use plugins. Otherwise, the data becomes inaccurate in just a few days or weeks and then nobody trusts this data or uses the system that retains the data because everyone assumes it’s out-of-date, so we wanted to fundamentally solve this issue,” said Bukarev.

Port’s software catalog reflects the data model and the software development lifecycle, and is continuously updated and enriched. This helps developers answer critical questions such as ‘what is the version of this service in staging vs. production?’, while driving ownership and accountability and offering a single pane of glass into all services and applications.
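To keep the catalog current without manual edits, integrations and pipelines can upsert entities programmatically, for example through Port’s REST API or its ready-made exporters. The payload below is a minimal, hypothetical example of the kind of entity a CI job might push on every deployment; the blueprint, identifiers and property names are illustrative, not Libertex’s actual model:

{
  "identifier": "cart-service-staging",
  "title": "Cart Service (staging)",
  "properties": {
    "version": "1.4.2",
    "environment": "staging"
  },
  "relations": {
    "component": "CartService"
  }
}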

The software catalog will play a crucial role in supporting Libertex’s adoption of InnerSource practices. By centralizing access to internal software tools and projects, the catalog facilitates easier sharing and collaboration across development teams.

Managing Technical Documentation

Technical documentation is currently spread across the company’s systems and applications, and Libertex is keen to make documentation easier to find, access, store and use. Port’s software catalog will help to make this a reality. For instance, an API catalog within the software catalog can have useful information about each API endpoint, including its documentation.

After collecting feedback from developers on their pain points and suggestions for handling documentation, the team will use Scorecards to ensure documentation standards are maintained. This will let the team see which components or systems lack the required documentation and get a picture of the state of documentation across the board. Compliance will be ranked in tiers: gold, silver and bronze. Later on, Libertex plans to add additional types of scorecards to drive compliance with even more standards.
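A documentation scorecard of this kind could look something like the sketch below. It is a minimal, hypothetical example rather than Libertex’s actual standards: the rule names and the readme and docsUrl properties are assumptions, and the rules grade a component as Bronze or Gold depending on which documentation properties are filled in:

{
  "identifier": "documentation",
  "title": "Documentation standards",
  "rules": [
    {
      "identifier": "hasReadme",
      "title": "Has a README",
      "level": "Bronze",
      "query": {
        "combinator": "and",
        "conditions": [
          {
            "operator": "isNotEmpty",
            "property": "readme"
          }
        ]
      }
    },
    {
      "identifier": "hasDocsSite",
      "title": "Has published documentation",
      "level": "Gold",
      "query": {
        "combinator": "and",
        "conditions": [
          {
            "operator": "isNotEmpty",
            "property": "docsUrl"
          }
        ]
      }
    }
  ]
}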


Using Port, Libertex has already managed to incorporate the display of GitLab Pages as embedded documentation alongside software catalog entities. The raw documentation-related content is kept near the source code and generated from there automatically. This ensures that documentation will live together with source code in the same repository, meaning that it will be maintained properly and provide relevant and accurate information to stakeholders.
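One way to achieve this kind of embedding in Port is an embedded-URL property on the relevant blueprint, pointing at the GitLab Pages site generated from the repository. The snippet below is a small, hypothetical sketch of such a property (the docsPage name is ours), added under the blueprint’s schema properties and assumed to be populated automatically alongside the rest of the entity’s data:

{
  "docsPage": {
    "title": "Documentation",
    "type": "string",
    "format": "url",
    "spec": "embedded-url"
  }
}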

AppSec

Previously, if there had been a security breach involving an application, it was difficult to identify which components were using the breached version and whether they were still running. By mapping vulnerabilities and misconfigurations as part of the software catalog, the team can see clearly whether a vulnerability is in production, with all the relevant data about the application in one place and in context.
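A simple way to model this is a dedicated blueprint for vulnerabilities that relates each finding to the affected component, from which mirror and calculation properties can roll the picture up to systems and environments. The blueprint below is a minimal, hypothetical sketch; the identifiers, severities and status values are illustrative, not Libertex’s actual schema:

{
  "identifier": "Vulnerability",
  "title": "Vulnerability",
  "schema": {
    "properties": {
      "severity": {
        "type": "string",
        "enum": [
          "critical",
          "high",
          "medium",
          "low"
        ]
      },
      "status": {
        "type": "string",
        "enum": [
          "open",
          "resolved"
        ]
      }
    },
    "required": []
  },
  "mirrorProperties": {},
  "calculationProperties": {},
  "relations": {
    "affectedComponent": {
      "target": "Component",
      "required": false,
      "many": false
    }
  }
}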

The Future 

Libertex has already migrated everything from its EAM tool into Port.

“The best testimony of Port's flexibility and power lies in our ability to successfully migrate all data from our previous tool for managing enterprise assets. On top of this, we have also shaped the data on-the-fly into a new form amenable for further evolution,” said Varga.

The focus now is on changing the way the organization deals with technical documentation.

After that, the company wants to focus on: 

  • Hooking up CI/CD pipelines to feed data into Port
  • Modeling environments, deployments and running components (to have a dynamic view)
  • Modeling misconfigurations and vulnerabilities
  • Modeling feature flags
  • Self-Service actions

The team has considered several ways of using self-service actions to enable developers to work autonomously, reducing the time developers spend raising tickets and waiting for the relevant specialist to respond.

  • Cost Management

The team wants to be able to track all costs in all of its environments and will look to integrate AWS Cost Exporter at some stage.

Conclusion

Libertex Group is at the beginning of its portal journey, but it is already seeing the benefits of an un-opinionated data model and the ability to map its value streams. Bukarev believes that Port will help the enterprise architecture team pivot whenever the organization requires, establishing the portal as a long-term solution that will keep evolving.
