Go Multi-National with Pega


Hi, I’m Anna Buchanan and I am currently one half of QA Consulting’s Pega Technical Leads team. My role involves growing and supporting our Pega offering, as well as supporting Pega consultants in the field. In addition, I am based on site as a Certified Senior System Architect and am currently studying for my Lead System Architect qualification. Over the coming months I will be updating you on the latest in Pega technology. Today’s topic is localisation.

Imagine you are working on a banking application for a client with offices all over the world. You may be working on an English version of this application, but it could be easier than you think to introduce the software to other countries. In this blog I’ll be talking about Pega’s localisation and translation abilities, to better inform you, and hopefully give you a chance to use it.

What is meant by “Localisation”?

Localisation, in the case of out-of-the-box Pega capabilities, means a few things: translating text, handling time zones and incorporating local conventions. The translation of text is the most obvious one; this simply means putting the application into a new language, and we will talk more about it later. Setting a new time zone means making sure the calendar, and any other time-specific processing, is in step with the local time. Local conventions are a bit more nuanced: they cover how things are presented in a locale, how to sort in a language that doesn’t use the Latin alphabet, and so on.
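To make the time zone point concrete, here is a minimal Python sketch (nothing Pega-specific; the deadline and the zone names are invented for illustration) showing one instant rendered in the local time of three different locales:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# One instant in time, stored once in UTC...
deadline = datetime(2016, 5, 15, 9, 0, tzinfo=timezone.utc)

# ...and rendered in each user's local time zone.
for tz_name in ("Europe/London", "America/New_York", "Asia/Tokyo"):
    local = deadline.astimezone(ZoneInfo(tz_name))
    print(tz_name, local.strftime("%Y-%m-%d %H:%M %Z"))
```

Storing timestamps once in a neutral zone and converting at the presentation layer is the general idea behind per-user time zone settings, whatever the platform.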

The Nitty Gritty

So what are the basics of implementing this? For the full details you should see Pega’s LSA course, but I will cover some of the high-level points here. To decide which locale you want to use, you can set Pega to look at the local machine settings, the browser setting or the application setting. Which you use will depend on your use case, and it is possible to switch between locales within one session if needed.

There is a localisation wizard that guides you through much of the process of creating a new translation. You can incorporate a Pega-provided language pack or supply your own; what is produced is a translation XML file that you can open in Excel and tweak as required. You then re-import this into Pega, which gives your application its translation capabilities.
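As a rough illustration of what a tool does with such a file, here is a Python sketch; the XML layout below is invented for the example (Pega’s real export follows its own schema), but the principle of pairing source strings with translations is the same:

```python
import xml.etree.ElementTree as ET

# Invented structure standing in for an exported translation file.
xml_data = """
<translations locale="fr-FR">
  <entry><source>Submit</source><target>Soumettre</target></entry>
  <entry><source>Cancel</source><target>Annuler</target></entry>
</translations>
"""

root = ET.fromstring(xml_data)
# Build a lookup table from source string to translated string.
catalogue = {e.findtext("source"): e.findtext("target") for e in root.iter("entry")}
print(catalogue["Submit"])  # -> Soumettre
```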

Utilising Language Packs and Making Your Own

Pega offers language packs for many languages, and these will translate the out-of-the-box UI around the case manager and case worker portals. It is widely recommended to build your applications using out-of-the-box rules as far as possible rather than making your own: the more closely your application has followed this convention in its UI, the more you will get out of a Pega language pack. For all your customised UI, you will need to create your own language pack, which the wizard can help you to do.

What about Architecture?

Finally, how does this fit into your wider organisation architecture? Largely it depends on the scope of your translations. Let’s say that in our organisation’s architecture only one application is expected to go multi-national. It would then make sense to build all of the localisation on top of the English version, for which you would need a separate Application/Ruleset for each language. But if your company is likely to need multi-national support across the organisation, it makes more sense to create translation rulesets at each level of the architecture: one for your Organisation ruleset, one for each division and so on. You can then reuse these rulesets in every application for that language.

In conclusion, Pega has plenty of capabilities to help you implement applications in multiple languages. The tool is designed to give you as much as possible out of the box, as that is where its real value lies, and the same goes for translation. The more you can get out of the translation packages the better, as translating custom UI strings is largely a manual job. Beyond translation itself, the time zone features are relatively hassle-free to implement, and having local conventions catered for greatly reduces our workload. With all of this, we can consider making our applications multi-national and get the most out of them. This is a great option to have, as many Pega clients are global vendors.

Microservices in the workplace


Hi, my name is Gareth Andrews and I am a Consultant at QA Consulting. I am currently deployed on client site as a developer, and an occasional Scrum Master when my services are required. Alongside this I am also training for a qualification in Information Security Management Principles. My current role as a developer has an emphasis on the production of Microservices, which brings me to the topic of this blog.

So what are these Microservices, and why are companies, some of them gigantic entities with hundreds of employees working on software, looking to include them in their systems?

“Microservices” is a term a lot of companies use, but the widely accepted description is ‘where complex applications are composed of multiple independent processes’.

The typical pitfall for many companies is that they keep expanding their systems, offering more and more without considering what would happen if one part fell down. Imagine a large tower with many floors: what would happen if you suddenly decided that you wanted to move the bedroom from the 1st floor to the 5th? With Microservices, your architecture allows you to move, replace and update as you go along, with support available for both new and old features, enabling you to create a stable and adaptable framework.

Companies like Netflix and Amazon use Microservices in order to help scale up their products, with new features and applications simply requiring you to add rather than adjust.

The point of Microservices is that they communicate with one another, with the mechanism varying depending on how you want the systems to work. What better way to communicate, though, than with HTTP web calls, the same way you would load up YouTube or Facebook after a long day at work?

RESTful web services are written with the internet at heart. These services allow you to navigate to an address and get responses, much like you would when you search for that ticket website in a rush. Allowing access to Microservices through calls to addresses means that all of the Microservices are accessible to one another, with none having to know anything about the others except their web addresses.
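As a sketch of the idea, here is a self-contained Python example using only the standard library (the service name, route and payload are made up) in which one “service” exposes an HTTP address and another calls it, knowing nothing but that address:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A minimal "accounts" microservice: other services only need its address.
class AccountsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/accounts/42":
            body = json.dumps({"id": 42, "status": "active"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), AccountsHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A second "service" calls it over plain HTTP, knowing only the address.
url = f"http://127.0.0.1:{server.server_port}/accounts/42"
with urlopen(url) as resp:
    account = json.loads(resp.read())

server.shutdown()
```

Swap the in-process server for one deployed on another machine and nothing about the caller changes except the address, which is exactly the decoupling described above.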

I think this quote by Richard Branson is a great place for me to finish: “Complexity is your enemy. Any fool can make something complicated. It is hard to make something simple.”

QA Consulting sponsors ServiceNow Knowledge16


We are proud to announce that QA Consulting will be sponsoring ServiceNow’s Knowledge16 Conference in Las Vegas, USA. This year’s event will take place at the Mandalay Bay Hotel on the 15th-22nd of May.

Knowledge is now in its tenth year. It is the world’s largest gathering of service management professionals and the go-to destination for anyone who is ready to transform their enterprise by managing everything as a service.

This year’s conference has more than 200 breakout sessions, 120 hands-on labs, pre-conference training and multiple general sessions led by industry visionaries, as well as hundreds of ground-breaking solutions to help you operate faster and become more scalable than ever before.

If you are attending Knowledge16 in Las Vegas please contact consulting@qa.com and arrange a meeting with one of our team or alternatively stop by our stand for a chat.

QA Consulting Sponsors MuleSoft Summit London


We are excited to announce that QA Consulting are sponsoring the 2016 MuleSoft Summit in London. This year’s event will take place at the Hilton London Metropole Hotel on the 5th of May.

The MuleSoft Summit is the perfect place for you to understand first-hand how you can leverage MuleSoft’s Anypoint Platform to deliver engaging customer, partner and employee experiences like never before.

If you are attending the MuleSoft Summit in London please contact consulting@qa.com and arrange a meeting with one of our team or alternatively stop by our stand for a chat on how we can help you to connect anything, change everything.

QA Consulting Sponsors Adobe Summit EMEA


We are excited to announce that QA Consulting are co-sponsoring the 2016 Adobe Summit EMEA with NETbuilder Digital. This year’s event will take place at the ICC ExCeL in London on the 11th-12th of May.

With over 3,500 of the world’s digital marketing leaders attending, the 2016 EMEA Summit is the largest of its kind in Europe. With a great mixture of keynote speakers, including Colin Farrell, Heston Blumenthal and Davina McCall, as well as speakers from Spotify, Unilever and BMW, this year is sure to be another great success.

The Adobe Summit is the perfect place for you to understand how to reach your customers and find out how you can personalise their experiences to make your company the first port of call. Attend sessions from marketing innovators and explore the latest tools and trends in the digital world, whilst understanding how other companies are using the Adobe Marketing Cloud and gaining insights into how they are creating more personalised campaigns to better reach their customers.

As a joint sponsor with NETbuilder Digital, we are in a prime position to better service our customers with their digital needs through our new Digital Centre of Excellence Enablement Programme, allowing customers to accelerate their investment, reduce operational cost, and maximise ROI of their Digital initiatives.

If you are attending the Adobe Summit in London please contact consulting@qa.com and arrange a meeting with one of our team or alternatively stop by our stand for a chat on how we can help assist in your digital strategy.

QA Consulting sign partnership deal with Hortonworks


QA Consulting are delighted to announce our new strategic partnership with Big Data specialist, Hortonworks.

QA Consulting carefully select which technology vendors we engage with as partners. Hortonworks’ growing presence in the Big Data arena makes them the ideal addition to our roster of vendors we are proud to partner with.

Hortonworks are the leader in emerging Open Enterprise Hadoop and develop, distribute and support the 100% open source Apache Hadoop data platform. Their data platform deeply integrates with existing IT investments and gives enterprises an open platform upon which to build and deploy Hadoop-based applications.

Our partnership with Hortonworks will allow our customers to benefit from a centralised architecture for running batch, interactive and real-time applications simultaneously across a shared dataset. This new partnership will complement our existing Big Data service offering, allowing QA Consulting to drive transformational use of organisational data and unlock its potential by enhancing our customers’ business intelligence.

Tony Lysak, Managing Director, QA Consulting commented “In today’s world, information is generated on a scale previously unquantifiable. The need to analyse and interrogate this data and create meaningful information to better support customers’ organisations has never been more important. Our partnership with Hortonworks, a leader in Enterprise Hadoop, will enable our customers to take advantage of our Centre of Excellence Enablement Programme, supplying expert Data Consultants to provide genuine scalability and the advantage required to stay ahead of their competition, regardless of company and sector”.

QA Consulting are excited to start our partnership journey with Hortonworks and help deliver on their mission to establish Hadoop as the foundational technology of the modern enterprise data architecture.

Life on Site with Gareth Andrews


One of the advantages of being a consultant is that the future is open to you. I am currently working towards getting a certification in Information Security Management, a certification that would better equip me to deal with the everyday risks and issues of security in a digital world.

Training and expanding your skills is a must these days, when jobs and opportunities are difficult to come by. There are many training courses available, and talks held, to help you learn and understand everything from techniques to new technologies; there is not a single week in which I feel I have not developed in some way.

With every moment being a chance to develop, it’s hard to not see something great coming your way.

So that’s me, past, present and future. I hope that these blogs have been useful in understanding just what we do as consultants for QA Consulting, from the fast paced rush of fixing issues to the calm and collected analysis required to plan ahead for the next piece of work weeks before you even start creating the code.

QA Consulting has given me the opportunity to get my foot in the door of a very competitive market, with all the support I need to work for a high-end business straight out of the Academy. Going forward I hope to train and develop my skills in information security so that our clients can rest easy knowing that, with my skills and expertise, I can help to protect their data.


Mohammed Shareef excelling on site


This week we would like to congratulate Mohammed Shareef on his outstanding work on client site.

Mohammed completed a Bachelor’s in Computer Science Engineering at JNT University (India) and a Master’s in Information Assurance at London South Bank University.

Since joining QA Consulting back in 2011, Mohammed has been trained and certified in middleware technologies (Oracle SOA/BPM/OSB and WSO2). Since completing The Academy he has been working with blue-chip companies such as Lockheed Martin, Fujitsu and Capgemini, mainly on large integration projects, working from concept to delivery, and he has gained a very good understanding of the complete SDLC. Mohammed states: ‘The best thing I like about QA Consulting is the platform it sets for individuals, where you are trained in niche, in-demand technologies and then given the opportunity to work with blue-chip companies and excel your career in IT’.

Since being onsite Mohammed has been helping to implement a new case management system and evaluating rules/decision technology that could be suitable.

WSO2 Business Rules Server and JBoss Drools were the technologies of interest, and Mohammed built two prototype solutions, one with each, delivering the required functionality along with source code and configuration files. Alongside this he carried out face-to-face knowledge transfer with the client and produced separate reports for WSO2 and Drools, covering how each met the acceptance criteria and, where they did not, what the challenges were and what alternatives he suggested.

Our client stated, “Mohammed had a professional attitude and produced high-quality work; the team were very pleased with his output”. Well done Mohammed on your accomplishments on site and keep up the good work!

If you are interested in a career with QA Consulting have a look at our Academy Website for more information on how you can build your future in tech.

Going faster with ChatOps

It has become widely recognised that adopting a DevOps culture leads to more stable, regular releases and increases collaboration between all those involved in the software delivery process. However, there is often still much room for improvement in increasing collaboration, closing feedback loops and reducing friction in processes.

What is ChatOps?

ChatOps is about taking the age-old and natural process of people collaborating and chatting, making this available for everyone to partake in, and then injecting the tools and automation to achieve tasks into the chat.

The great thing for those teams who are adopters of DevOps practices is that they will likely already have a high level of automation in many of their processes, such as testing, deploying, provisioning and resolving infrastructure issues. Integrating these into the chat means they are driven in a collaborative, visible manner, with feedback given to all.

In short, ChatOps is about aggregating information and triggers, making the discussions open, and making the actions taken democratic and visible.

How is it done?

The most common pattern is to use a bot such as GitHub’s Hubot, Err, or Lita. These are integrated with real-time messaging tools like Slack, Atlassian’s HipChat or IRC. Hubot adaptors are available for almost any widely used chat tool.

In Hubot’s case, scripts are written which control the bot’s behaviour and how it responds to messages in the chat. This allows actions to be triggered as a result of users interacting with the bot in the chat room. For example, “@bot deploy latest version of app to UAT” could be configured to deploy the latest version of the code to the UAT environment. The possibilities are endless; if it is not possible, or sensible, to implement the action within the Hubot scripts, they can be configured to make HTTP calls to external services, which can drive more complicated actions. For example, integrations can be made with build servers, CI pipelines and configuration management tools.
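Hubot scripts themselves are written in CoffeeScript or JavaScript, but the pattern is easy to sketch in Python; everything below (the command wording, the handler names) is illustrative rather than any real Hubot API:

```python
import re

# A toy dispatcher in the spirit of a Hubot script:
# regexes mapped to handler functions.
handlers = []

def respond(pattern):
    def register(fn):
        handlers.append((re.compile(pattern), fn))
        return fn
    return register

@respond(r"deploy latest version of (\w+) to (\w+)")
def deploy(match):
    app, env = match.groups()
    # A real script would call out to the CI or deployment tooling here.
    return f"deploying {app} to {env}..."

def on_message(text):
    for pattern, fn in handlers:
        match = pattern.search(text)
        if match:
            return fn(match)
    return None  # the bot stays silent on messages it doesn't recognise

print(on_message("@bot deploy latest version of app to UAT"))
```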

Hubot can also be configured to provide HTTP endpoints for external services to integrate with – Hubot can therefore keep the chat updated with external events. This can facilitate a great workflow where the bot alerts the chat to issues received from monitoring of the application or infrastructure; users discuss and undertake actions in the chat to fix the issue; and positive feedback is received in the chat when the issue is fixed.
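The inbound direction can be sketched in the same spirit; the alert fields below are invented, but the shape of the workflow, JSON in from a monitoring webhook and a human-readable line out to the chat, is what matters:

```python
# Turn a monitoring alert (as it might arrive on a webhook endpoint the bot
# exposes) into a one-line message the bot can post to the chat room.
def alert_to_message(alert):
    level = alert.get("level", "info").upper()
    return f"[{level}] {alert['service']}: {alert['summary']}"

incoming = {"level": "critical", "service": "payments-api", "summary": "5xx rate above 2%"}
print(alert_to_message(incoming))  # -> [CRITICAL] payments-api: 5xx rate above 2%
```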

Why should you do it?

  • It will open up processes and practices for all to see, empowering your team members and spreading the knowledge.
  • It will reduce the silos, tighten feedback loops and encourage collaboration.
  • ChatOps will encourage further automation of the underlying tasks.

Life on Site with Gareth Andrews


So you’ve learnt about my past and my time at the Academy; I think it’s now time for you to know what I do on client site. I am currently working in a team focused on delivering Microservices. Here’s a day in the life of me:

You start the day nice and early, checking all those work emails that arrive a minute after you turn off your computer, and then you check the JIRA board. Typically each day we assign ourselves a task and get it completed by the end of the day, meaning each day we see progress both in the project and across the JIRA board.

Once you’ve started working on your daily task, you have a stand-up at around half nine in the morning. In the next 15 minutes you hear from everyone in the team: what they did the day before, any problems they had and what they are looking to do today. It is also when you learn about wider factors that affect the project, such as changes to requirements or people being off on holiday for the next week. This means that once you’ve left the stand-up you know everything that should be happening that day.

So, back to the computer for an hour or so.

It seems the team has a new piece of work coming up, so it’s time for an impacting session. After reading through a document received a few days earlier, our product owner gives us a run-down of the reasons behind the new piece of work. It could be an extension of current work, or something completely different.

Everybody gets a chance to discuss their area of the project; for example, what kind of code will be required, be it changing previous code or creating something entirely new. This impacting session gives the team an estimate of how much time the work would take to complete and its “impact” on the team.

12:00, time to go and get some lunch. Remembering it’s Friday, you head out the door instead of going to the canteen, laughter and chat following you as your team and friends grab a bite to eat. With full stomachs you return to work, mumbling as usual about wanting to go back for dessert, before returning to your task for the day.

Things don’t always go smoothly, and suddenly there is an issue on your JIRA board that’s affecting live. A bug has made it through to production!

Step one: investigate the bug. You run the code with the same data to try to replicate the problem. At this point speed is of the essence, but you can’t afford mistakes either. With the problem reproduced, you notice that the error message is something you’ve seen before, and you quickly find it in the code.

With a quick note on the JIRA issue and a phone call, you’ve already pushed the fix through the continuous delivery environments, with all the automated tests running and checking that it won’t open up a Pandora’s box of problems.

A few minutes later you receive a message saying that it’s been deployed to live and they are running the file again, with a smiley face and a thankful phone call confirming that the error has been fixed.

With that all done, it’s time to return to your work for the day, finishing it just in time as you log your hours and move the task across the board to show everyone your progress. You shut down the machine and make sure everything is locked away, ready to repeat the process again tomorrow.

Join me next time when I will be discussing what comes next.