Tuesday 1 December 2015

Quick Guide to Internet of Things



The “Internet of Things” (IoT) has been a big topic of conversation in the workplace for some time, and considering how much it could change the way we live and work, it’s no surprise that it continues to make headlines. Broadband Internet has become more widely available, the cost of connecting is decreasing, more devices are being built with Wi-Fi capabilities and sensors, technology costs are going down, and smartphone penetration is sky-rocketing. All of these things are creating a “perfect storm” for the IoT. This short article addresses what exactly it is, what impact it will have on you, what development trends we are seeing in this field and what other areas of software development can learn from it.

What exactly is the IoT?

In its simplest form, the concept involves connecting any device with an on/off switch to the Internet (and/or to other devices). This includes everything from cell phones, coffee makers, washing machines, headphones, lamps and wearable devices to almost anything else you can think of. It also applies to components of machines, for example the jet engine of an airplane or the drill of an oil rig. As mentioned, if it has an on/off switch then chances are it can be a part of the IoT. The analyst firm Gartner says that by 2020 there will be over 26 billion connected devices…that’s a lot of connections (some even estimate this number to be much higher, at over 100 billion). The IoT is a giant network of connected “things” (which also includes people). The relationships will be people-to-people, people-to-things, and things-to-things.

How does this impact you?

The new rule for the future is going to be, “anything that can be connected, will be connected.”  But why on earth would you want so many connected devices talking to each other?  There are many examples of what this might look like and of the potential value. Say, for example, you are on your way to a meeting: your car could have access to your calendar and already know the best route to take, and if the traffic is heavy it might text the other party to let them know you will be late. What if your alarm clock woke you at 6 am and then notified your coffee maker to start brewing? What if your office equipment knew when it was running low on supplies and automatically re-ordered more?  What if the wearable device you used in the workplace could tell you when and where you were most active and productive, and shared that information with the other devices you used while working?
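Under the hood, device-to-device scenarios like the alarm-clock-and-coffee-maker example are usually wired together with a lightweight publish/subscribe protocol such as MQTT. The sketch below is a minimal illustration using the open-source paho-mqtt Python client; the broker address and topic names are invented for the example and not tied to any particular product.

# Minimal MQTT publish/subscribe sketch (assumed broker address and topic names).
# Requires the paho-mqtt package: pip install paho-mqtt
# (paho-mqtt 1.x style constructor; 2.x versions also need a callback API version argument.)
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"   # hypothetical home IoT broker
TOPIC = "home/kitchen/coffee"   # hypothetical topic the coffee maker listens on

def on_message(client, userdata, message):
    # The "coffee maker" side: react to commands published on the topic.
    if message.payload.decode() == "brew":
        print("Starting to brew coffee...")

# Subscriber (the coffee maker)
coffee_maker = mqtt.Client()
coffee_maker.on_message = on_message
coffee_maker.connect(BROKER, 1883)
coffee_maker.subscribe(TOPIC)
coffee_maker.loop_start()

# Publisher (the alarm clock) sends a command when the alarm fires.
alarm_clock = mqtt.Client()
alarm_clock.connect(BROKER, 1883)
alarm_clock.publish(TOPIC, "brew")

In practice the broker would run on a home hub or in the cloud, and each device would authenticate to it, but the pattern of small messages on named topics is the same.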

On a broader scale, the IoT can be applied to things like transportation networks and “smart cities”, which can help us reduce waste and improve efficiency in areas such as energy use, helping us understand and improve how we work and live.  It does sound great, or at least parts of it does, but the main concern is the security implication. Will someone be able to hack into your toaster and thereby get access to your entire network?  The IoT also exposes companies all over the world to more security threats.  Then we have the issue of privacy and data sharing.  That is, and always will be, a hot topic, so one can only imagine how the conversation and the concerns will escalate when we are talking about many billions of devices being connected.  Another issue that many companies in particular will face is the massive amount of data that all of these devices are going to produce.  Companies need to figure out a way to store, track, analyse and make sense of the vast amounts of data that will be generated.

 

Development Trends

Let’s start off with a few facts and insights from the Evans Data Corporation Internet of Things Development Study 2015. A third of all IoT developers are primarily focusing on Big Data and analytics projects, with 20.6% primarily focusing on firmware or preloaded software for the client device. Middleware (20%), which is essential for enterprise-wide adoption of IoT strategies, along with backend/server development (17.2%) taken together comprise the majority of development efforts.
21.7% of all IoT developers surveyed are working on ecommerce-related projects today. Business-to-Consumer (B2C) has a slight edge in developer numbers at 11.7%, versus 10% of developers working on Business-to-Business (B2B) projects. Developers are also concentrating on supply chain-related projects, including logistics (7.4%) and transportation (7%).
42% of IoT developers are currently writing software that uses sensors. IoT developers are most often supporting acceleration and vibration sensors while creating new apps, followed by electric/magnetic and flow-based sensor devices. The study makes the point that acceleration or vibration sensors can refer to the sensors that change the orientation of a touch screen based on the way in which the device is held, and can measure the stability of stationary objects or objects with moving parts.
23% of IoT developers are currently working with or incorporating in-memory databases into their development work, and 44% plan to in the next six months.

Parting Thoughts

As everything we do becomes increasingly connected, the volume of available data will grow in proportion. That, at the end of the day, is the key: the data. It needs to be kept secure and managed efficiently, all while being used to fulfil an undoubtedly massive potential.




George Toursoulopoulos is a technology specialist and CEO of Synetec, one of the UK’s leading providers of bespoke software solutions.

Monday 30 November 2015

Synetec playing golf in support of DEBRA

Synetec were proud to get involved with one of the DEBRA Charity Golf Days this year. The event was held at the immaculate and prestigious Swinley Forest Golf Club.  

Most importantly the day raised awareness and funds for the tremendous work and effort that DEBRA puts into helping and supporting the individuals and families affected by Epidermolysis Bullosa (EB). This is a great and worthy cause, so if you can support them in any way then we encourage you to. Click here to learn more about the great work being undertaken by DEBRA.







The imposing Swinley Forest Club House

Tuesday 27 October 2015

Cyber Security & Learning from TalkTalk


When we last discussed cyber security, the point about high-profile companies being hacked was a relevant one, and the latest incident involving TalkTalk has just hammered it home once more. This short article discusses what happened and what we can all learn from it.

 

Short version of what actually happened

TalkTalk do not know how many of their 4 million customers have been affected by the data breach, but what we do know is that the cyber-attack, which took place on Wednesday 21st of October, affected people’s names, addresses, bank account numbers and sort codes. As TalkTalk put it on their website, these details ‘may have been accessed’. TalkTalk were hacked, the hackers attempted to blackmail them, and TalkTalk refused to pay and reported the breach.

What can we learn from it?

If there is only one thing to learn from it, it’s that system data must be encrypted. Most systems encrypt debit/credit card details, but the potential to take advantage of other customer data these days is enormous. There is a thriving black market for stolen data, and you don’t even have to go to the Darknet to get it; it’s readily available online.
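As a concrete illustration of what “encrypting system data” can look like, here is a minimal sketch using symmetric encryption from the widely used Python cryptography package; the field names are invented for the example, and real deployments also need careful key management, which is out of scope here.

# Minimal example of encrypting customer data at rest, assuming Python and
# the 'cryptography' package (pip install cryptography). Key management
# (storing, rotating and protecting the key) is deliberately omitted.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in reality, load this from a secure key store
cipher = Fernet(key)

# Hypothetical customer record fields
sort_code = b"12-34-56"
account_number = b"12345678"

encrypted_sort_code = cipher.encrypt(sort_code)
encrypted_account = cipher.encrypt(account_number)

# Only code holding the key can recover the plain text
assert cipher.decrypt(encrypted_sort_code) == sort_code

A database dump stolen from a system built this way yields only ciphertext, which is exactly the protection the TalkTalk customers did not have.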

The other thing to take away from the TalkTalk incident is the need for a proper post-breach response plan. This should include the quick distribution of accurate information to the relevant parties (customers and authorities), allowing proper assistance where required, but also lessening the impact on your customers and beginning the process of rebuilding their trust in your organisation.



George Toursoulopoulos is a technology specialist and CEO of Synetec, one of the UK’s leading providers of bespoke software solutions.

Wednesday 30 September 2015

Software, Data Hacking and Fitness Bands



So everybody, well almost everybody, is wearing a fitness tracker of some variety. Almost everybody, because I haven’t jumped on that bandwagon yet, but that’s another story. With the ability to develop applications specifically for these devices, and for these apps to be available across devices, there is an increasing amount of interest in this topic. This short article outlines some basic capabilities, what you should consider and some of the security risks you should know about to prevent your data from being ‘hacked’.

Part 1: Software Development

Back in May 2015 Microsoft released an SDK (software development kit) for those who want to create apps for the Microsoft Band fitness tracker. With this SDK developers are able to create applications that can access information from the fitness band’s sensors, and also allow applications to send notifications from a paired smartphone to the band. This allows developers to create applications that support Windows, access all calorie data recorded and stored in the fitness band, and connect to the band from tasks running in the background. The functionality exposed by this SDK includes access to all the fitness band’s sensors, such as the heart rate monitor, accelerometer and gyroscope. Versions of the SDK are available for each mobile operating system. This, in tandem with Microsoft Health (a cloud-based fitness service that offers personalised health-related information using data gathered from fitness bands), gives Microsoft the potential to compete with Apple and Google and their offerings.
Ignoring the individual software vendors and fitness bands themselves, the scope is simply immense! Not only are existing fitness apps so much more useful (because, let’s face it, who wants to input all their calorie intake or exercise details into an app when it can be done automatically?), but the opportunity to provide focused, relevant and well-informed services to the consumer has never been better. The more devices that become connected and can share your health-related information, the more useful they become. I will admit I was pretty envious when hearing about the Wi-Fi weight scale that shared information with a fitness band, with all of that information then shared with an app on the user’s phone to provide impressive data and reporting. With the ability to develop and deliver apps that can access some of these devices, the potential is exciting.

 

Part 2: Your health data

All the data recorded by the sensors on your fitness band is available when pairing with the device over the Bluetooth LE protocol, which doesn’t require a password to pair two devices. Using the standard Android SDK you can easily scan for any Bluetooth LE fitness band in the vicinity and attempt to connect to it. In fact, the only thing that stops anyone from simply connecting to these bands and accessing all the data is if a phone is already connected to the device! However, it is also possible to disrupt the connection between a paired phone and a band, giving the attacking software the opportunity to connect instead. So what is the downside of having your band hacked? Not much at this stage: these devices are still in their relative infancy and only record calorie and exercise-related information. But that is changing, and would you really want your GPS-recorded location history available to anyone? It could quite easily let someone work out where you work, live, and so on.
No doubt the band manufacturers will address this, but until then it remains possible.
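For the technically curious, the discovery step described above takes only a few lines of code. The article refers to the standard Android SDK; to keep this sketch self-contained it uses the cross-platform Python bleak library instead, which performs the same Bluetooth LE scan. What it lists will depend entirely on the devices broadcasting nearby.

# Bluetooth LE discovery sketch using the cross-platform 'bleak' library
# (pip install bleak) rather than the Android SDK mentioned above.
# It simply lists advertising BLE devices, such as fitness bands, in range.
import asyncio
from bleak import BleakScanner

async def main():
    devices = await BleakScanner.discover(timeout=5.0)
    for device in devices:
        print(device.address, device.name)

asyncio.run(main())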



George Toursoulopoulos is a technology specialist and CEO of Synetec, one of the UK’s leading providers of bespoke software solutions.

Friday 4 September 2015

Managing Home Working



Many IT roles are ideal for a flexible working policy; however, there are always issues to consider when implementing such a policy. This article addresses some of these issues and provides some guidelines to ensure that both the business and the employees are better off for it.

Productivity

Certain roles within IT lend themselves to task-based planning, which is perfectly suited to remote working. A software developer, for instance, is assigned tasks, each task has an agreed timeline, and the dev then gets on with it. As long as the tasks are being done to a reasonable schedule and at the quality specified, the business is better off because it has happier employees, and they are happier because they can work when and how it suits them. Regular review of task progress eliminates most of the potential pitfalls. In short, productivity needs to be quantifiable.

Communication

Email can be cumbersome, so instant messaging and VoIP phones should be used to make life easier. Additionally, for certain types of tasks and the early phases of projects, there is no substitute for being in the same location and having a face-to-face conversation. The bottom line is that communication must be easy for all concerned, and at certain points there really is no substitute for meeting in person; that has to be recognised by everyone within the team.

Company Ethos

There is sometimes a pre-existing idea within a business that remote working is an opportunity to slack off. Handled correctly, the opposite is true: the company often gets additional hours at no charge, because team members feel extra pressure to be productive and put in a ‘good shift’ when they cannot physically be seen. That works to the company’s advantage, and all parts of the organisation need to understand it. This can be a harder sell in some companies, but that’s part of the challenge.

Summary

With longer and more expensive commutes, the rising cost of office space and better infrastructure available to homes across the country, home working makes more and more sense. That doesn’t detract from the fact that some people simply aren’t suited to it, or may not have a home environment that is conducive to it, and that has to be taken into account. It doesn’t mean they are bad employees or slackers, nor that home working doesn’t work, but it does need to be identified and agreed that it might not be suitable for them. Finally, any employer obligations need to be considered, such as health and safety assessments where applicable, insurance, etc.



George Toursoulopoulos is a technology specialist and CEO of Synetec, one of the UK’s leading providers of bespoke software solutions.

Wednesday 24 June 2015

Top 3 Tips to Effective Software QA



Introduction

Software Quality Assurance is, and always will be, a challenge; furthermore, it can be a costly one. Getting it right requires a combination of the right people and the right processes, both of which require investment and prioritisation within the organisation. Below are some of the key elements in getting it right.

 

Build the right QA team

Firstly, let’s not kid ourselves…hiring good QAs is hard. There are many low-end candidates and a fair bit of separating the wheat from the chaff. We see a lot of CVs from candidates working in large outsourcers, and most of these candidates are more suited to working in a tightly managed team performing routine tasks. You need to find diligent, bright people who are capable of understanding the systems. The nature and complexity of the systems have a big effect on the calibre of candidate required: there is a large difference between testing a simple ecommerce website selling widgets and testing a risk management system for a financial institution. These QAs will need to get into the nitty-gritty of the system, how the users will use it and what the devs might not have thought of, in order to find the faults. The other consideration is that when you do hire the right people, their calibre means you will need to allow for career development; it’s fairly common to see the right candidates move into BA roles, so the challenge doesn’t end with hiring the right candidate.

Testing Automation

If you are not automating the majority of your testing, QA becomes a mind-numbing and time-consuming process that is a breeding ground for human error, and it becomes extremely difficult to get the consistent regression testing that you need for high-quality software in production. Back to the point above: the right QAs will have the ability to produce quality test plans and to automate them. Automating the regression testing of the product is also the main area where you should consider using outsourced resources, as they can assist in getting over a one-off resource bottleneck. Ideally, automated test scripts should be run nightly on the latest build, with the results being reviewed and interpreted every morning by the test team; this should in turn filter through to the dev team so that product regression is kept to a minimum.
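As a deliberately simplified illustration of what an automated regression check can look like, the sketch below uses pytest; the calculate_order_total function is hypothetical and stands in for whatever business logic the product exposes.

# Hypothetical automated regression test, written with pytest (pip install pytest).
# 'calculate_order_total' is a stand-in for real product logic under test.
import pytest

def calculate_order_total(prices, vat_rate=0.20):
    """Stand-in implementation so the example is self-contained."""
    net = sum(prices)
    return round(net * (1 + vat_rate), 2)

def test_total_includes_vat():
    assert calculate_order_total([10.00, 5.00]) == 18.00

def test_empty_order_is_zero():
    assert calculate_order_total([]) == 0.00

@pytest.mark.parametrize("prices,expected", [
    ([100.00], 120.00),
    ([19.99, 0.01], 24.00),
])
def test_known_regressions(prices, expected):
    # Cases added from previously reported bugs, so they stay fixed.
    assert calculate_order_total(prices) == expected

A scheduler or CI job can run the whole suite against the nightly build, for example with pytest --junitxml=results.xml, so the test team has a report to review each morning.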

Storyboards and Testing Plans

All test storyboards should be logged in the product backlog for reference. The number can be quite large, but it should be done. All bugs found by the testing team should be logged against the storyboard too, so that they are included going forward and help avoid product regression. Going back to automation, the more test storyboards that have been automated, the lower the reliance on manual testing and the challenges that come with it.

 

Conclusion

Ideally, you should treat testing as part of your product development engineering and integrate it with development as much as possible. Automate as much as is feasible for your scenario and ensure you have high-quality QAs for what needs to be performed manually.


George Toursoulopoulos is a technology specialist and Director at Synetec, one of the UK’s leading providers of software services and solutions.

Monday 1 June 2015

Synetec supporting SportsAid

Synetec were proud to get involved with the SportsAid Charity Golf Day this year. The event was held at the immaculate and prestigious Stoke Park.  

Most importantly the day raised awareness and funds for the tremendous work and effort that SportsAid puts into helping the next generation of British sports stars by giving them financial support and recognition during the critical early years of their careers. Click here to learn more about the great work being undertaken by SportsAid.



The imposing Stoke Park



The challenging and gorgeous 7th hole, which was the inspiration behind the famous 16th hole at Augusta

Wednesday 29 April 2015

Case Study: CRM and Portfolio Management System Integration


Industry: Financial Services

Introduction: The organisation wanted to maximise their return on investment from their CRM system and assist the sales team to develop new business by having the appropriate data from their Portfolio Management System available in a secure and timely manner.

Challenges:
• The business had significant information available within their PMS, such as client holdings across funds, positions, subscriptions, redemptions and valuations, which had to be retrieved manually; this was in itself such a time-consuming process that it was mostly avoided
• This PMS-based information needed to be reviewed in conjunction with the data stored in the client relationship system, as looking at all related information in one place makes it far more powerful and effective
• The data needed to be up to date, visible only to authorised personnel, and must not negatively impact the performance or stability of the PMS

Objectives:
• The data from the PMS needed to be enriched with all available related information from the CRM
• All relevant information pertaining to the client needed to be accessible from within 1 screen
• The information needed to be available near real-time
• The performance of the PMS needed to be unaffected
• Security should be in place to ensure that only relevant information was available and that any confidential or unrelated information was excluded

Solution:
This Investment Manager partnered with Synetec in order to deliver functionality to enrich and make available information that has improved the performance of their sales team. In the absence of a proprietary API, the data from the PMS was accessed through scheduled nightly reports.

Not only could the sales person view all relevant CRM information of the client, but also any transactions involving the client’s accounts and the valuation of their holdings.

Benefits:
This system allowed the valuable data that was locked in the PMS to contribute to the success of the business as a whole. Initial indications are that sales have improved significantly with the sales team attributing the majority of that increase to this integration and the additional information at their fingertips.



George Toursoulopoulos is a technology specialist and Director at Synetec, one of the UK’s leading providers of software services and solutions.

Tuesday 7 April 2015

Top 3 Tips when taking software to mobile devices


Introduction

"Can we have a version of that for tablet and smartphone please?" We have long since passed the point of debating whether tablets and smartphones will be used for business; the challenge now is when a proprietary system has functionality that is inaccessible on these devices. This article addresses the key criteria to assess before taking that step.

Which part?
"We want to use the system on our iPad" is all well and fine, but does the entire system need to be accessible via a tablet? For example, in a CRM system, the list of clients and client contacts is viewed almost every time the user logs on, it's a primary purpose of the system, but do they really to be able to change their user settings or schedule reports to run via a tablet version? It might make sense to take a phased approach and leave the less frequently used parts of the system for a future phase, if at all. The main objective is usually to make the frequently used parts of the system available on different devices, not to make a complete mobile version of the system, making that distinction can save an awful lot of time and money.

Which way?
The existing system's architecture will influence many of the decisions to be made and also affect the effort required. Having the same functionality implemented separately on different platforms greatly increases the effort and cost of implementing new functionality or changing existing functionality. For example, in a Windows-based system it might make sense to move all common functionality into a WCF service that can be called by both the Windows application and the mobile version; this ensures a single code base and greatly increases maintainability. Thought also has to go into which platforms to develop for: are both Android and iOS required? If so, can that perhaps be achieved by making a dynamic web-based version of the application (using responsive or adaptive HTML) that is accessible and works across platforms? Often a change to the main application can make moving forward much easier.
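The article's suggestion for the shared layer is a WCF service; purely to illustrate the pattern of one set of business logic serving both the desktop and the mobile front ends, here is an analogous sketch as a small HTTP service using Python's Flask. The endpoint and data are invented for the example, not taken from any real system.

# Illustrative shared-service sketch using Flask (pip install flask); this stands in
# for the WCF service suggested above, purely to show the pattern. Both the desktop
# application and the mobile/web front end would call the same endpoint, so the
# business logic lives in exactly one place.
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical data standing in for the real CRM database
CLIENTS = [
    {"id": 1, "name": "Acme Capital", "primary_contact": "J. Smith"},
    {"id": 2, "name": "Widget Partners", "primary_contact": "A. Jones"},
]

@app.route("/api/clients")
def list_clients():
    # Every client (desktop, tablet, phone) gets the same answer from the same code.
    return jsonify(CLIENTS)

if __name__ == "__main__":
    app.run(port=5000)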

Watch it!
Maintaining the system's integrity is often overlooked in these types of projects. With all the excitement of moving to a mobile platform, issues such as system and data security can be neglected, and with this different type of accessibility come different types of security challenges. The usability of the system is also something that can be underestimated: to do this properly, the commonly used pieces of the system need to be redesigned so that they are usable on the different devices. To get the most out of the system, the tablet version will look different from the smartphone version.


George Toursoulopoulos is a technology specialist and Director at Synetec, one of the UK’s leading providers of software services and solutions.

Friday 6 March 2015

Quick Guide to Software Security


Security has been a priority for companies for many years now, and with so many high-profile companies being hacked, it's no wonder. With brute-force, dictionary and rainbow table attacks, the time it takes to crack a password is frighteningly short. This guide discusses some of the methods used to crack passwords and what can be done to protect your systems against security threats.

 

How is the hacking done?

With massively parallel password cracking on general-purpose graphics processors and rainbow tables, it's possible for hackers to test more than 500,000,000 passwords per second, even with low-end hardware. Depending on the software, rainbow tables can be used to crack 14-character alphanumeric passwords in about 160 seconds. That's faster than my daughter takes to unlock my iPhone passcode!
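To put that guess rate in context, here is the back-of-an-envelope arithmetic for a plain brute-force search at that speed:

# Rough brute-force timings at 500,000,000 guesses per second.
GUESSES_PER_SECOND = 500_000_000

def worst_case_seconds(alphabet_size, length):
    return alphabet_size ** length / GUESSES_PER_SECOND

# 8 lower-case letters: 26^8 combinations -> roughly 7 minutes
print(worst_case_seconds(26, 8) / 60)        # ~7.0 minutes

# 8 mixed-case letters and digits: 62^8 combinations -> roughly 5 days
print(worst_case_seconds(62, 8) / 86_400)    # ~5.1 days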

Rainbow tables achieve this by comparing a stolen password database against a precomputed table of password hashes. This requires a large amount of memory, and memory is cheap. With hardware improving, a password doesn't stand a chance. Over and above these techniques, social engineering still remains a big threat: all the encryption and strong passwords in the world don't mean a thing when the user gives out their password. Phishing tactics are getting better and are very effective; with false emails and forged websites they trick an alarming number of people into giving up their passwords.


What are the options?

Basically it boils down to single factor or multi-factor/two-factor authentication (2FA). Single factor authentication secures a system through only one category of credentials, for example a login and a password. 2FA is where a user's credentials are made up of two independent factors.





Single Factor

There are challenges with attempting to secure your system with a password alone. The most common is that users either don't understand how to create a strong yet memorable password or underestimate the need for security.

The extra rules needed to make passwords strong often result in users forgetting them or having problems, which in turn means password resets that usually rely on help desks (with the associated costs). Single factor does have its advantages though: it's cost-effective, easier to manage and fewer things can go wrong.

There are some things that can be done in order to make it more effective though, namely:

  • Passwords need to be long enough (a minimum of 8 characters) and include a mixture of upper-case and lower-case letters and numbers. A password meter is recommended and has been proven to help (a simple check along these lines is sketched after this list).
  • Passwords could be partially entered, for example only characters 3, 5 and 7 of the password
  • Passwords should never be stored in the database in plain text; store them encrypted (or, better still, salted and hashed) so that the software can verify a login attempt without the raw password ever being exposed
  • Where possible the login and password can be locked down by 1 or more IP addresses (although that effectively becomes 2FA)
  • Users need to be educated on how to protect themselves and their passwords
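As a simple illustration of the first point, here is a minimal sketch of the kind of check a password meter might apply. Real meters and policies are usually more sophisticated (dictionary checks, entropy estimates, breach lists), so treat this as a starting point only.

# Minimal password strength check, illustrating the rules in the list above.
def password_issues(password):
    issues = []
    if len(password) < 8:
        issues.append("must be at least 8 characters")
    if not any(c.islower() for c in password):
        issues.append("must contain a lower-case letter")
    if not any(c.isupper() for c in password):
        issues.append("must contain an upper-case letter")
    if not any(c.isdigit() for c in password):
        issues.append("must contain a number")
    return issues

print(password_issues("Passw0rd"))   # [] - passes these basic rules
print(password_issues("letmein"))    # three issues reported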




2FA

As mentioned before, 2FA is where a user's credentials are made up of two independent factors, such as:

  • Something that the user knows (PIN, password, questions, etc...)
  • Something that the user possesses (key fob token, mobile phone, smartcard, etc...)
  • Biometric data (fingerprint, iris, voiceprint, etc...)

Obviously some of the above options are going to be more suitable than others and there is a cost implication with each of these. I would like to briefly discuss the more popular options in order to give a better understanding and also because it is unlikely that a company will protect their CRM system with an iris scan. Horses for courses.

Hardware tokens are the most prevalent, most commonly implemented by giving the user a key fob that is combined with a password. The key fob displays a pseudo-random number that changes periodically, and the user inputs this number to prove that they have the token. The server that is authenticating the user must also have a copy of each key fob's 'seed record', the algorithm used and the correct time, and can then in turn authenticate the user. The key fob itself contains the same algorithm and 'seed record' and generates the number that is verified by the server. There are alternatives to the key fob, such as USB-stick-based solutions, for example YubiKey, which is used by Google, Facebook and the US Department of Defense. With such high-profile customers and a cost starting from $18 per user, it is understandable why it is so popular.

Software tokens are on the rise: the key fob functionality has been replicated on the smartphone and has been in use since around the year 2000. The technology is exactly the same as the hardware version, but instead of needing an additional fob, an app on the smartphone is used. Different software apps are available for smartphones as well; products like Toopher can verify where the user (or their smartphone) is physically located, and the first time a user tries to log in from a new location, they must grant permission to do so via the app. Pricing starts at $1 per month per user.
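The number generation that both hardware and software tokens rely on is standardised as time-based one-time passwords (TOTP, RFC 6238). Here is a minimal sketch with the open-source pyotp Python package to show the moving parts; the shared secret plays the role of the 'seed record' held by both the token and the server.

# Time-based one-time password (TOTP) sketch using pyotp (pip install pyotp).
# The shared secret stands in for the 'seed record' known to both the
# token (or phone app) and the authenticating server.
import pyotp

secret = pyotp.random_base32()   # provisioned once, stored by server and token
totp = pyotp.TOTP(secret)

code = totp.now()                # what the token or app displays right now
print("Current code:", code)

# Server side: verify what the user typed in, allowing a little clock drift
print(totp.verify(code, valid_window=1))   # True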

Another effective way to authenticate a user with the aid of their mobile phone is by sending them a code via text message; this code changes with every request and expires after a short time. This is a relatively simple and cost-effective solution, with companies providing text message capabilities from a couple of pence per message.
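Generating and expiring such a code is straightforward; a minimal sketch follows. The actual sending of the text message would be handed off to an SMS gateway provider and is omitted here.

# Sketch of a one-time code sent by text message: random, single-use and expiring.
import secrets
import time

CODE_LIFETIME_SECONDS = 300   # code is valid for five minutes

def issue_code():
    code = f"{secrets.randbelow(1_000_000):06d}"   # e.g. "048291"
    expires_at = time.time() + CODE_LIFETIME_SECONDS
    return code, expires_at

def verify_code(entered, issued_code, expires_at):
    return time.time() <= expires_at and secrets.compare_digest(entered, issued_code)

code, expires_at = issue_code()
print("Would text the user:", code)
print(verify_code(code, code, expires_at))   # True, if entered before expiry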




Parting Thoughts

There are many solutions to deal with an ever-increasing challenge that we all have to address in one manner or another. You don't need a machine gun to kill a mosquito, though (not sure if that is a saying, but it should be); the key is to take into account the various factors that influence your security requirements.

Those factors include how sensitive the information is, what the repercussions would be if the system were hacked (customer confidence, regulations, etc.), the particulars of the users (how many, where they are located, etc.) and costs.




George Toursoulopoulos is a technology specialist and CEO of Synetec, one of the UK’s leading providers of bespoke software solutions.

Thursday 5 February 2015

3 Reasons to NOT move away from Excel Development


Following the feedback I received from the earlier article titled 3 Reasons to move away from Excel, it seemed necessary to talk about why it would make sense not to move away from an Excel-based solution. There are obvious reasons why you would create a solution within Excel, but this article discusses why you would stick with it in the medium to long term and furthermore what you need to plan for initially in order for it to be robust enough to deliver value in the long term.

Dynamic Environments

When requirements are ever-changing, when inputs can vary regularly and when outputs need to be highly configurable, Excel is still an excellent choice. Its weakness is, in this scenario, its strength. While an application built in compiled code sitting on a relational database can add robustness and scalability in the same scenario, it is also slower than Excel to change. There is a trade-off, and if the environment is very dynamic, Excel might be the most sensible choice.

Everybody is a coder

In certain environments, where requirements demand a very specific skill set and where the ability to learn basic programming skills is a complementary mindset, it can make sense to have the users develop their own applications. A few of our clients are actuaries, who are ideal candidates for this scenario: they can pick up VBA coding quickly, they obviously understand the requirements, and in that situation Excel can be the perfect platform. When a more permanent data store is not required and solutions are used for repeatable calculations, a non-trained programmer with all the business knowledge can be very beneficial.

Cheap today, could also be cheap tomorrow

Software licenses, database licenses, support contracts, servers, cloud platforms and development tools are all a necessity in a more structured development environment. There are costs on both sides of the fence; it's simply a case of weighing up the costs on each side along with the requirements as a whole. In the scenario above, actuaries are most definitely well equipped to build their own Excel-based solutions, but the cost of their salaries needs to be taken into account.

Excel and Longevity

So, you have decided it makes sense to build a new solution in Excel, or to keep an existing one; how do you ensure a return on your investment? We get called in fairly regularly to perform Excel system audits, and the primary reason is that the solution is no longer performing as it once did (deteriorating performance or errors). Often that is combined with a team member having moved on and the solution being extremely difficult for other team members to take over. Managing code of any kind, even within an Excel application, is made infinitely easier if certain basic programming principles are adhered to, and those principles can be learnt relatively easily with some initial training. An initial audit can reveal instances of non-optimal coding practices and potential problem areas, along with how to correct them. Documentation is another big area where we find a difference can be made: as tedious as it might be to create, it makes a significant difference and should be considered essential. The true rewards of documentation are reaped when that team member moves on.


George Toursoulopoulos is a technology specialist and CEO at Synetec, one of the UK’s leading providers of bespoke software solutions.

Thursday 8 January 2015

Case Study: Broker unlocks the value of their data with Exchange Web Services


Industry: Financial Services

Introduction: The organisation wanted to use the valuable information stored within Microsoft Exchange to help the sales team service their existing customers and develop new business. There was a significant advantage to be gained by accessing the email, calendar and contact information stored within Exchange and making it available to the appropriate team members in the right context.

Challenges:
• The business had significant information available within Exchange, such as emails sent to or received from the client, appointments with client contacts and specific contact information, and this information was only available through Outlook, which made it time-consuming to locate, if it could be located at all
• This Exchange-based information needed to be reviewed in conjunction with the data stored in the client account management system, as looking at all related information in one place makes it far more powerful and effective
• The data needed to be near real-time without affecting the performance of Exchange, which held a significant amount of data

Objectives:
• The data from Exchange needed to be enriched with all available related information
• All relevant information pertaining to the client and all of their employees needed to be accessible from within 1 screen
• The information needed to be available near real-time
• The performance of the Exchange server needed to be unaffected
• Configurable rules should be applied to ensure that only relevant information was available and that confidential and unrelated information was excluded

Solution:
This broker partnered with Synetec in order to deliver functionality to enrich and make available information that has improved the performance of their sales team. The data from Exchange was accessed through the Exchange Web Services API and cross-referenced with data available in other internal systems, allowing the data to be enriched by building a relational model around it.
Not only could the sales person view all relevant emails from and appointments with the client contact, but they could do so while also viewing their trade history, personal information, call information and company related information.
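The case study does not name a client library, but for readers curious what querying Exchange Web Services can look like, here is a minimal sketch using the open-source Python exchangelib package; the mailbox address, credentials and search term are placeholders only.

# Illustrative Exchange Web Services query using the open-source 'exchangelib'
# package (pip install exchangelib). Addresses, credentials and the search term
# are placeholders; the case study itself does not specify a client library.
from exchangelib import Credentials, Account

credentials = Credentials("service.account@example.com", "password")
account = Account(
    primary_smtp_address="sales.team@example.com",
    credentials=credentials,
    autodiscover=True,
)

# Pull the ten most recent emails mentioning a client, ready to be
# cross-referenced with CRM records.
recent = (
    account.inbox
    .filter(subject__contains="Acme Capital")
    .order_by("-datetime_received")[:10]
)
for message in recent:
    print(message.datetime_received, message.subject)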

Benefits:
This system allowed the valuable data that was locked in Exchange to contribute to the success of the business as a whole. Initial indications are that sales have improved by more than 5% on the same period last year with the sales team attributing a portion of that increase to this system and the additional information at their fingertips.



George Toursoulopoulos is a technology specialist and Director at Synetec, one of the UK’s leading providers of software services and solutions.