Year: 2017

14 Nov 2017

Hybris: How to replace the base_{locale}.properties approach for fetching localized content for a website


In the Hybris e-commerce platform, localization of content on web pages is handled by the Spring framework class “ReloadableResourceBundleMessageSource”, which takes care of fetching messages from base_{locale}.properties files.

base_{locale}.properties :

The keys and their values are referenced in the JSP/tag files using the Spring framework tag library.

Problem Statement or current limitations

The approach of fetching localized content from base_{locale}.properties has certain limitations, which can be illustrated by the following scenario:

  • A website has launched and a long-awaited campaign is finally live. The site is experiencing heavy traffic, with many customers accessing it across the globe. The content manager suddenly realizes that some content on the site is incorrect and needs to be corrected immediately.
    • What are the steps to correct the content on the site?
      • Inform the IT team that manages the site and explain the issue with the content.
      • IT team analyzes the issue and finds the root cause.
      • IT team informs the content manager that the content value in base_{locale}.properties needs to be corrected. The change itself takes only a few minutes, but the servers need to be restarted for the updated content to appear.
      • Servers are restarted after the changes are made in base_{locale}.properties, and the updated content shows up on the site.

Instead of every such minor content change going through the above cycle:

  • What if such a change could be made without any server restart?
  • What if a content manager is given the ability to update content without relying on the IT team and have it reflect on the site immediately?

How amazing would it be if a content manager had the power to make changes to content just as he/she manages content in WCMS (Web Content Management System)!


Here’s an approach that will empower a content manager to make changes to content that is part of base_{locale}.properties without having to chase an IT team to get a fix in place, and without having to restart servers, while still having the change reflect immediately on the site.

  1. Create an item type which stores all the key=value pairs defined in base_{locale}.properties.
    • The item type can be made catalog aware to ensure that the content value that is changed can be previewed before allowing the change to reflect on the site.
    • The value qualifier in the item type can be localized to ensure that this approach takes care of storing the values of locales that the website supports.
  2. Override the ReloadableResourceBundleMessageSource class of the Spring framework, which fetches values from base_{locale}.properties, with a new class CustomBundleMessageSource that fetches the values from the database.
  3. Make the newly created item type, which stores the keys and values usually kept in base_{locale}.properties, available in Back Office, so that content managers can log in to Back Office and make the necessary changes without relying on an IT team for content corrections.
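The override in step 2 can be sketched in simplified form. This is a minimal illustration only: in a real Hybris project the class would extend Spring's AbstractMessageSource (an ancestor of ReloadableResourceBundleMessageSource) and run a FlexibleSearch query against the new item type; the DatabaseMessageResolver interface below is a hypothetical stand-in for that query.

```java
import java.util.Locale;

// Minimal sketch of CustomBundleMessageSource. In a real project this would
// override AbstractMessageSource#resolveCodeWithoutArguments; the resolver
// below is a hypothetical stand-in for a FlexibleSearch query on the item type.
class CustomBundleMessageSource {

    interface DatabaseMessageResolver {
        String findMessage(String code, Locale locale); // null if not found
    }

    private final DatabaseMessageResolver resolver;

    CustomBundleMessageSource(DatabaseMessageResolver resolver) {
        this.resolver = resolver;
    }

    // Return the stored value for the key and locale, or null so the
    // framework can fall back to a parent message source.
    String resolveCodeWithoutArguments(String code, Locale locale) {
        return resolver.findMessage(code, locale);
    }
}
```

Returning null for unknown keys mirrors Spring's message-source contract, which lets a parent message source (for example, the original properties files) serve as a fallback during migration.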

Performance considerations

The proposed solution replaces base_{locale}.properties with an item type, which means every request for localized content results in a database call, giving rise to performance concerns. To avoid a database hit for every request, load the result of each query against the item type into a cache, so that any subsequent request for the same message is served from the cache instead of the database.

To achieve this, introduce a class, say “BaseMessagesCacheRegion”, which extends “EHCacheRegion”. It will hold all the values returned by the queries that fetch the content messages from the database.
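The caching behaviour can be illustrated with a simplified cache-aside sketch. A real implementation would live in the EHCacheRegion subclass described above; the ConcurrentHashMap, the hard-coded value and the databaseCalls counter below are illustrative stand-ins, not Hybris APIs.

```java
import java.util.Locale;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Simplified stand-in for BaseMessagesCacheRegion: resolved messages are
// cached, so repeated requests for the same key never reach the database.
class BaseMessagesCache {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    final AtomicInteger databaseCalls = new AtomicInteger(); // for illustration only

    // Hypothetical database lookup; in Hybris this would be a FlexibleSearch query.
    private String loadFromDatabase(String code, Locale locale) {
        databaseCalls.incrementAndGet();
        return "Welcome!"; // pretend the item type holds this value
    }

    String getMessage(String code, Locale locale) {
        // computeIfAbsent hits the "database" only on a cache miss
        return cache.computeIfAbsent(code + "|" + locale.getLanguage(),
                key -> loadFromDatabase(code, locale));
    }
}
```

Two lookups of the same key in this sketch trigger only a single database call; all later lookups are pure cache reads.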

This leads to another point of consideration: when a content manager updates content from Back Office, the change will not appear on the website immediately, because the old value is already in the cache. As long as the system finds the content’s key in the cache, it will not query the database for the value the content manager has updated. To display the updated content on the website, one would have to log in to HAC and clear the entire cache.

This means the cached data for all content is cleared, not just the entry the content manager updated via Back Office. To address this, introduce an interceptor that updates the cache with the recently changed content, eliminating the need to clear the entire cache from HAC.

Note: There are many ways to keep cached content up to date; the approach used here is an interceptor.
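The interceptor idea can be sketched in the same simplified model. In Hybris this would typically be an interceptor or save-event listener registered against the new item type; the classes and method names below are hypothetical stand-ins that show only the write-through step.

```java
import java.util.Locale;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Simplified stand-in for the cache region holding resolved messages.
class MessageCache {
    private final Map<String, String> entries = new ConcurrentHashMap<>();

    String get(String code, Locale locale) { return entries.get(key(code, locale)); }
    void put(String code, Locale locale, String value) { entries.put(key(code, locale), value); }

    private static String key(String code, Locale locale) {
        return code + "|" + locale.getLanguage();
    }
}

// Hypothetical interceptor: when a content manager saves an updated message
// in Back Office, write the new value straight into the cache, so the site
// reflects it immediately without clearing the whole cache via HAC.
class BaseMessagesSaveInterceptor {
    private final MessageCache cache;

    BaseMessagesSaveInterceptor(MessageCache cache) { this.cache = cache; }

    void onSave(String code, Locale locale, String newValue) {
        cache.put(code, locale, newValue); // update only the affected entry
    }
}
```

Only the single affected entry is refreshed; all other cached messages remain untouched.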

References from a working example:

  • Snapshot of Back Office displaying “BaseMessages” item type created to load the base_{locale}.properties data.

  • “contentBaseMessageCacheRegion” that’s introduced to store the cached content of BaseMessages.

12 Jun 2017
IoT Generated Data


Shoes telling us that we’re getting slower running our usual route, refrigerators ordering milk automatically and thermostats resetting themselves are applications that are poised to revolutionize the lives of consumers and provide an incredible opportunity for retailers and service providers to serve and build long-lasting relationships with them. The key currency of this relationship is data, and therein lies one of IoT’s biggest challenges today.

For starters, there is the security concern. A typical house of the future will have 100 to 150 IoT-enabled devices connected to its home network, up from 5-10 a couple of years ago. Internet-enabled devices in a house, once limited to mobile phones, TVs, laptops and an occasional thermostat, will in the future be supplemented by 1 or 2 refrigerators, multiple wearable devices (fitness trackers to glasses to watches), shoes, undergarments (think Under Armour’s IoT play), consoles (the PlayStations and Wiis of the world), light bulbs, home security systems, HVAC, sprinkler systems, cooking appliances, home theater systems, etc. Suddenly, 100 to 150 access points now need to be secured, and so does the data being generated and sent across. Apart from the mundane “how many miles did you run today” and “your light bulb needs a change”, sensitive medical, financial and other private data is also flowing back and forth, which of course needs to be secure. And the complexities only increase at business establishments, where the number of IoT-enabled devices can run into thousands.

Security at the providers’ end is also a challenge. With all the data coming in, providers need to reevaluate their security infrastructure, policies and systems. The CISO (Chief Information Security Officer) is already a harried individual today. The enterprise of the future will have significantly more data coming from an ever-increasing number of connected devices, and the points of breach will continue to challenge the CISOs.

Second, there is the challenge around network bandwidth and performance. Whether it is home Wi-Fi or the service provider’s network, the sheer volume of data being generated and consumed is going to explode – exponentially. For example, it is generally understood that Tesla collected 1.3 billion miles’ worth of data from autopilot-equipped vehicles in just 2 years. That’s a lot of data! Availability and access to data usually ends up creating demand for more data, and that will eventually put a tremendous load on the infrastructure. At all levels – global, service provider, enterprise and home – are we prepared to address bandwidth and performance issues? Whether we like it or not, this will need to be addressed.

Last but not least is the question of what to do with the data – the action taken on it. Wearable devices (with authentication and permissions) may be generating data for the retailer about how consumers shop in their physical stores. But are retailers prepared to analyze this data and get some actionable information out of it? I think not. Or refrigerator orders for replenishing milk might be coming in by the thousands, but is the manufacturer prepared to understand what this data means for them? I believe we will eventually get there, but right now we are just not prepared to make sense of the volume of data that’s expected to be generated, beyond the usual demographic and transactional analysis. It is, however, around deeper analysis that we will see the real value. They say data is king. Certainly! And only if the king delivers.

The solution (or solutions) for the above challenges will eventually be found. Either technology or process (or both) will handle this. Indeed, every day there’s a new announcement regarding technology or a new approach to systems and/or processes (or transformations that an enterprise needs to make). So, it is a matter of time before these get addressed adequately. Till then, whether you are a consumer, an enterprise, a service provider or simply an enthusiast, do spend some time thinking through the data challenges while planning your IoT initiatives.

09 Mar 2017

Pragiti Receives SAP® Hybris® 2017 Partner of the Year Award for Service Delivery – Americas

Pragiti today announced it is the recipient of the SAP® Hybris® 2017 Partner of the Year Award for Service Delivery, Americas. Awards were presented by SAP Hybris to the top-performing partners in 2016 that have made outstanding contributions to driving customers’ digital transformation. Recipients of this year’s awards have been – in partnership with SAP Hybris – helping organizations adopt innovation easily to attract, retain and grow a profitable customer base.

The award recognizes Pragiti for delivering numerous customer successes in 2016 across different industries, geographies, channels and solution types.

“In a channel-agnostic world, Pragiti works closely with SAP Hybris in providing organizations with a consistent, compelling and contextual experience for their customers – for their online and offline journey. For our innovative work, project successes and customer delight to be recognized and appreciated by SAP is highly motivating to us and the hundreds of Pragitians who work diligently to make our clients win in the digital world,” said Praveen Pahwa, Founder & CEO at Pragiti.

Selected from SAP Hybris’ broad partner ecosystem, nominations for the SAP Hybris Partner of the Year Awards were based on internal SAP data. A steering committee composed of regional and global SAP Hybris representatives determined winning partners in each category according to a number of criteria including joint sales achievement, customer case studies and consultant certifications. Awards were presented in a variety of categories, including overall partner performance by geography, innovation, service delivery and newcomer.

A Gold-level partner since 2011, Pragiti has been delivering digital disruption solutions – in partnership with SAP Hybris – for global brands that power everyday lives. These include cosmetics, active lifestyle clothing, sunglasses, prescription lenses, scientific instruments, medical equipment, farming products, education aids and steel processing.

Pragiti received its award during the SAP Hybris LIVE: Digital Summit conference, an annual gathering bringing together thousands of professionals from across the globe who are involved in the customer journey, ranging from marketers and developers to senior executives. The one-day event contains both digital and physical elements with streamed and recorded sessions from three locations: Singapore, Munich and New York. The event focuses on the overarching theme of “Go Beyond Disruption.”

About Pragiti
Pragiti, Inc. is a leading eCommerce solutions and services company focused exclusively on providing channel-agnostic solutions based on SAP Hybris to its global clients. As an SAP Hybris Gold partner, Hybris Market Extend partner and SAP Hybris product development partner, Pragiti is the eCommerce partner for several well-known brands across multiple industries including auto, cosmetics, eye care, fitness, home improvement, manufacturing & distribution, medical equipment, specialty apparel, etc.

SAP, Hybris and SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP SE (or an SAP affiliate company) in Germany and other countries. See for additional trademark information and notices.

Source –

06 Feb 2017

SOAP/REST message monitoring in Eclipse and IntelliJ

Web service applications, whether SOAP or REST, send a lot of information in the headers along with the actual message. Most of this information is not captured in the application logs.

For debugging web service client/server applications, it is necessary to see the exact message sent on the wire, i.e. the HTTP message along with the HTTP headers. Application logs usually print only the web service request, and printing the HTTP headers needs additional configuration based on the specific library being used.

SOAP UI- sample Raw HTTP message
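A raw message of the kind such tools display looks roughly like this (an illustrative SOAP request with made-up endpoint and payload, not from a real service):

```
POST /services/OrderService HTTP/1.1
Host: example.com
Content-Type: text/xml; charset=utf-8
SOAPAction: "urn:placeOrder"

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <placeOrder><orderId>1234</orderId></placeOrder>
  </soapenv:Body>
</soapenv:Envelope>
```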

Eclipse and IntelliJ provide plugins which help intercept the HTTP messages sent between the client and server by acting as a man in the middle. This is a very useful feature for debugging the HTTP requests generated by application code, and it is very easy to set up.

Client/Server — Normal Communication

Client/Server — Communication with TCP Monitor

Eclipse Plugin — TCP/IP Monitor

TCP/IP Monitor -Eclipse

IntelliJ Plugin — TunnelliJ

TunnelliJ HTTP Monitor

For advanced debugging and monitoring of TCP traffic, Wireshark can be used. I will try to cover this in a separate post.

23 Jan 2017

Using Stagemonitor with hybris

Stagemonitor is an open-source solution for performance monitoring of Java applications. It provides insights about the call stack, method execution time, page load time, JVM, JDBC and request metrics, and helps to better understand and improve the performance of applications.

Stagemonitor can be used in both development and production environments. It imposes a very low overhead on the application.

Stagemonitor for Development

hybris storefront with Stagemonitor widget

For development environments, Stagemonitor provides a widget which is injected into the web page being monitored. This widget gives details about the call stack, the time spent in each method, web requests and JVM metrics.

Stagemonitor widget-Call Tree tab

Call Tree tab provides the call stack and the time spent in each method.

Stagemonitor widget-Request tab

Request tab shows the total page load time, as well as the time taken for the network, server processing and DOM processing.

Stagemonitor widget-Metrics tab

Metrics tab provides details about JDBC, JVM and web request metrics.


Stagemonitor can be configured to send the performance metrics to a time series store such as Elasticsearch, with dashboards in Grafana. This allows the flexibility to monitor requests/metrics over a period of time and helps in understanding application performance issues.

Multiple application instances running on different hosts can be monitored at a time using Stagemonitor.

Elasticsearch — JVM Metrics

Integrating with hybris

1. Download and install the extension below and add an entry in localextensions.xml. This extension contains all the dependencies and properties for running stagemonitor.

Stagemonitor hybris extension
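The localextensions.xml entry would look roughly like this (assuming the extension folder is named `stagemonitor`; the entry goes inside the existing `<extensions>` element of your localextensions.xml):

```xml
<extensions>
    <!-- ... existing extensions ... -->
    <extension name="stagemonitor" />
</extensions>
```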

2. The configuration can be controlled in the file at the path below.

${HYBRIS_BIN_DIR}/bin/custom/stagemonitor/resources/, de.hybris = true

stagemonitor.applicationName=Electronics storefront
stagemonitor.instanceName=Electronics storefront


stagemonitor.instrument.jdbc.dataSource.implementations=com.mysql.jdbc.jdbc2.optional.MysqlDataSource, org.apache.tomcat.jdbc.pool.DataSource, org.apache.tomcat.dbcp.dbcp.PoolingDataSource, org.apache.tomcat.jdbc.pool.DataSourceProxy, org.apache.commons.dbcp2.PoolingDataSource, org.apache.commons.dbcp.PoolingDataSource, org.springframework.jdbc.datasource.AbstractDriverBasedDataSource, org.hsqldb.jdbc.jdbcDataSource, org.apache.commons.dbcp.BasicDataSource, de.hybris.platform.jdbcwrapper.HybrisDataSource

#stagemonitor.elasticsearch.url= http://localhost:9200

3. The Tomcat general options property needs to be modified to include the Java agent below.


4. Once hybris is restarted after the above changes, stagemonitor starts collecting the metrics. Depending on the configuration, either the widget can be used from the browser or the metrics can be sent to the Elasticsearch server for further analysis.