Following the establishment of the partnership between Pontem and Seeq, we have been diving into the technical elements of the platform and exploring the myriad use cases for this great combined capability. Aside from individual applications, Pontem recently attended the Seeq Roadshow in Houston and has undergone intensive training from the Seeq team to uncover the power we can unleash with the platform.
A month mucking around with Seeq
I’ve spent the past month in the Seeq sandbox, experimenting and training to become a Seeq Certified Analytics Engineer, and I’ve come away highly impressed. Below, I demonstrate some of the features that stood out:
Ease of use and functional flexibility:
Seeq Workbench is Seeq’s no-code/low-code analysis platform with a powerful calculation engine. It empowers process engineers without a programming background, enabling them to …
…clean signals…
Denoising an input signal
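While Workbench does this with no code at all, the underlying idea can be sketched in plain Python. The snippet below is a generic illustration using a centered moving average; it is not Seeq's actual filter implementation, and the sample signal values are made up.

```python
# A minimal sketch of signal denoising via a centered moving average.
# This is a generic illustration, not Seeq's filtering algorithm.
def moving_average(signal, window=3):
    """Smooth a noisy signal; the window shrinks near the edges."""
    half = window // 2
    smoothed = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        smoothed.append(sum(signal[lo:hi]) / (hi - lo))
    return smoothed

# Hypothetical noisy signal with a spike at index 3.
noisy = [1.0, 1.2, 0.9, 5.0, 1.1, 1.0, 0.95]
clean = moving_average(noisy, window=3)
```

The spike at index 3 is flattened toward its neighbors, which is the basic effect a denoising filter achieves (real filters such as Seeq's are considerably more sophisticated).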
…conduct statistical analysis…
A simple statistical analysis, averaging product viscosity and density according to product grade over ~1 - created with just a few clicks!
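A grouped average like the one above takes a few clicks in Workbench; for comparison, here is a minimal hand-rolled equivalent in Python. The product grades and property values are hypothetical sample data, not from the analysis shown.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical samples: (grade, viscosity in cP, density in kg/m3).
samples = [
    ("A", 12.1, 870.0),
    ("A", 11.9, 872.0),
    ("B", 25.4, 905.0),
    ("B", 24.6, 903.0),
]

# Group the samples by product grade.
by_grade = defaultdict(list)
for grade, visc, dens in samples:
    by_grade[grade].append((visc, dens))

# Average each property within each grade.
summary = {
    grade: {
        "avg_viscosity": mean(v for v, _ in rows),
        "avg_density": mean(d for _, d in rows),
    }
    for grade, rows in by_grade.items()
}
```

Even this tiny version needs imports, grouping logic, and careful bookkeeping, which underlines how much friction the no-code approach removes.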
…develop simple prediction models…
Comparing a regression model against field data
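To make the comparison above concrete, here is a sketch of the simplest possible regression: an ordinary least-squares line fit, checked against field data. The field values are hypothetical, and Workbench's regression tools are far richer than this.

```python
# A minimal ordinary least-squares fit of y = a*x + b,
# for illustration only (not Seeq's regression tooling).
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical field data lying exactly on y = 2x + 1.
field_x = [0.0, 1.0, 2.0, 3.0]
field_y = [1.0, 3.0, 5.0, 7.0]

slope, intercept = fit_line(field_x, field_y)
# Residuals show how well the model matches the field data.
residuals = [y - (slope * x + intercept) for x, y in zip(field_x, field_y)]
```

Plotting model predictions against measurements and inspecting the residuals is exactly the kind of comparison the Workbench view makes visual.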
…and so much more!
Including forecasting trends and setting up conditional notifications and alerts.
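Conditional alerting boils down to finding intervals where a signal violates a condition for long enough to matter. The sketch below shows that core idea in plain Python; in Workbench you define such conditions with clicks rather than code, and the pressure values here are invented.

```python
# Hypothetical sketch of a conditional alert: flag intervals where a
# signal exceeds a limit for at least min_run consecutive samples.
def alert_windows(values, limit, min_run=3):
    """Return (start, end) index pairs of sustained limit violations."""
    windows, start = [], None
    for i, v in enumerate(values):
        if v > limit:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_run:
                windows.append((start, i - 1))
            start = None
    # Handle a violation that runs to the end of the data.
    if start is not None and len(values) - start >= min_run:
        windows.append((start, len(values) - 1))
    return windows

pressure = [10, 11, 15, 16, 17, 12, 18, 19, 20, 21]
alerts = alert_windows(pressure, limit=14, min_run=3)
```

Requiring a minimum run length avoids alerting on a single noisy sample, which is the same reasoning behind duration filters on Workbench conditions.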
Workbench is also built to work with asset structures and hierarchies, and one nifty feature I appreciated was the ability to swap assets into the analysis you are doing with just the click of a button. This makes an analyst’s job much easier: no more copy-pasting poorly written code.
There are many more analysis features which the Seeq team have thought about and made available within Workbench, none of which require you to write any code. But in case your analysis requires functionality outside of what’s readily available, Workbench also allows you to create complex functions with its Formula tool (which, as I understand it, is the backbone of many of the preset functionalities).
The library of functions available within Formula is also vast, with plenty of documentation. One piece of feedback I do have for the Seeq team, though, is to make Formula’s error messages more descriptive when you have syntax violations.
Integration with Python:
Seeq Data Lab is a JupyterLab environment within Seeq which lets you query your data source for specific types of data, perform calculations or modifications on the data, and push the results to Workbench for visualization. The key to this is the SPy Python module, which enables Seeq functionality within a Python script or Jupyter notebook.
Data Lab is particularly handy when you have trained a complicated model like a neural network. Within the Jupyter notebook, you can load your model, pull in feature data, conduct feature engineering, generate predictions, and then push the results back to Workbench. I can easily imagine a scenario in which your data scientists use Data Lab to generate predictions, schedule them to be generated and pushed at a set frequency, and operators see those predictions in Workbench and take the necessary actions.
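That workflow can be sketched roughly as below. In a real Data Lab notebook the pull and push steps would use the SPy module (e.g. `spy.pull` and `spy.push`); here those steps are stubbed with hypothetical in-memory stand-ins so the sketch is self-contained, and the feature logic and "model" are invented for illustration.

```python
# Sketch of a Data Lab prediction workflow. The pull/push functions are
# hypothetical stand-ins for SPy calls, not the real SPy API.

def pull_feature_data():
    # Stand-in for pulling feature signals from the historian via SPy.
    return [
        {"time": "2024-01-01T00:00", "temp": 80.0, "flow": 10.0},
        {"time": "2024-01-01T01:00", "temp": 85.0, "flow": 12.0},
    ]

def engineer_features(rows):
    # Hypothetical derived feature: temperature-to-flow ratio.
    for row in rows:
        row["temp_per_flow"] = row["temp"] / row["flow"]
    return rows

def predict(rows):
    # Stand-in for a trained model; a real notebook would load one
    # (e.g. with joblib) and call its predict method.
    return [{"time": r["time"], "prediction": 0.5 * r["temp_per_flow"]}
            for r in rows]

def push_to_workbench(results):
    # Stand-in for pushing a new signal that operators can trend.
    return len(results)

rows = engineer_features(pull_feature_data())
results = predict(rows)
pushed = push_to_workbench(results)
```

Scheduling this notebook to run at a set frequency closes the loop: predictions land in Workbench on a cadence, and operators act on what they see.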
The SPy module seems packed with functionality and I am keen to explore this more in the coming weeks!
Data Connection Flexibility:
Seeq supports an array of data connectors, from AVEVA Data Hub to Canary Labs Historian to Cognite to OSIsoft PI to you name it. This is really important for us at Pontem, as it gives us the flexibility to deploy Seeq across multiple client system types.
Closing Thoughts:
While I have given a rundown of the features I found most impressive in my thus far limited time with the platform, I would like to conclude with some thoughts on why our partnership with Seeq is important.
At Pontem, one of our key strengths is the ability to derive insights from process data, create, deploy and maintain models, while ensuring transparency and accountability.
Seeq fits into this puzzle by letting us seamlessly integrate our machine learning workflows into OT systems and letting operators/other stakeholders in the business network easily monitor performance. The sheer variety of analysis tools at your disposal also ensures that clients can easily modify/customize dashboards without raising support tickets or being locked behind a front-end with limited customizability.
Seeq Roadshow Houston
As part of a series of local user group meetings we attended the Seeq Roadshow in Houston earlier this week. With the aim of showcasing the latest advancements in the Seeq platform and hearing from local operators who have used the system in novel ways, this was a great opportunity to further deepen the partnership.
Lisa Graham kicked off the event with the high-level strategy for the software and the company, tracing the platform’s evolution from its start as a self-service data analytics engineering tool to a full cloud-based enterprise platform. A few of the senior staff went over some of the recent developments, including the increased extensibility of the platform to enable wider operational excellence and full enterprise monitoring. Some other key highlights of new capabilities which continue to evolve are:
Creation of custom scripted tools which can be called as part of normal analysis
Expanded notification functionality based on process conditions and advanced logic
Increased ease of scaling across industrial and enterprise systems
Expanded dashboard configuration, tabular reporting, and charting functionality
Better options for 3rd party data sharing
The team also demonstrated the power of their new Seeq AI Assistant which utilizes generative AI and LLMs to provide an intuitive way of interacting with the local data and analysis. The system has been trained on the existing Seeq learning materials to provide guidance on how to achieve certain analysis outcomes and interpret the results to provide unique insights. This is still very much a work in progress but has tremendous potential as a tool for managers and engineers to rapidly navigate through the data to get to the crux or root cause of an issue.
Several customers presented detailed use cases for Seeq in their organisations, ranging from a refiner utilizing the platform for emissions monitoring and reporting to a chemical manufacturer using Seeq’s data analytics and anomaly detection capabilities to provide advance warning of potential distillation column fouling. These examples demonstrated the tangible value realized from extended analysis of facility data and the need to combine this analysis with specific domain knowledge.
Finally, a panel session rounded out the afternoon with viewpoints from panelists from the cement, chemical manufacturing, and enterprise IT industries. This was an excellent session offering unique perspectives on the intersection of historically conservative industries with the emerging technologies being used to reshape how work is performed. A key element discussed was the need for change management: how experienced workers’ knowledge is transferred, how to embrace new technologies, and the need to focus on people-centric use cases.
It was great to catch up with the Seeq team and a number of clients and partners to discuss how we're all putting the platform to use. One thing that really stood out to me after going through the onboarding for Seeq is that there still remains so much more potential in the platform to solve some of the big problems experienced by industrial operations. Like so many systems out there we often only use 10% of the capability… but at least we now have the key to unlock that potential!
Please reach out to info@pontemanalytics.com to discuss how we can help solve your business’ most important problems!