We love to share as much as we love finding answers to the interesting problems we work on.

On our side we have a team of industry veterans, domain experts, proficient business consultants and competent technologists with fascinating minds and captivating ideas.

We share what we think, what we feel...

Welcome to the Fulcrum Worldwide blog! See what's happening in the world of IT/Fulcrum, right now!

Microservices: the need and the benefits

The problem statement

Enterprise software is often built with a three-tier architecture: a client-facing interface, a server-side application, and a relational database or other persistent storage. When the client sends a request, the server-side application processes it by executing business logic, accessing the database to retrieve data, and sending HTML views back to the client's browser. This server-side application is a monolith: a single logical executable. Any change to the system involves building and deploying a new version of the server-side application; even the smallest change requires that the entire system be rebuilt and redeployed.


In a large enterprise application, a monolithic code base creates complications on several fronts: the code becomes hard to understand, scaling becomes challenging, and continuous integration and deployment grow complex and often daunting. It is also difficult to change the technology or language framework, because everything is tightly coupled and interdependent.

The Solution

Architect the server-side application as a set of collaborating services; that is, functionally decompose the application into services. Each service implements a narrow set of related functions, and services communicate through well-defined interfaces, so changing one service's implementation has no impact on the others.

In a Microservice Architecture (MSA), a software component is an independently running service that interacts with the other parts through message exchange. Services are developed and deployed independently of one another. Data governance and data architecture are decentralised, which minimises dependencies: each service has its own database so that it stays decoupled from the others. Consistency between databases is maintained using either database replication mechanisms or application-level events. The independence of services helps achieve high cohesion and loose coupling, which deliver all sorts of benefits (reliability, scalability, reusability, and so on).
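The application-level-events approach to consistency can be sketched in miniature. The Python below uses an in-process pub/sub bus as a toy stand-in for a real message broker; the service and event names are hypothetical. Each service keeps its own private data store and learns about the other's changes only through published events:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process pub/sub bus (stand-in for a real message broker)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self._subscribers[event_type]:
            handler(payload)

class OrderService:
    """Owns its own 'database' (a dict) and announces changes as events."""
    def __init__(self, bus):
        self.orders = {}          # this service's private data store
        self.bus = bus

    def place_order(self, order_id, customer_id, amount):
        self.orders[order_id] = {"customer": customer_id, "amount": amount}
        self.bus.publish("OrderPlaced", {"order_id": order_id,
                                         "customer_id": customer_id,
                                         "amount": amount})

class CustomerService:
    """Keeps a denormalised copy of what it needs, updated via events."""
    def __init__(self, bus):
        self.totals = defaultdict(float)   # private store: spend per customer
        bus.subscribe("OrderPlaced", self.on_order_placed)

    def on_order_placed(self, event):
        self.totals[event["customer_id"]] += event["amount"]

bus = EventBus()
orders = OrderService(bus)
customers = CustomerService(bus)
orders.place_order("o-1", "c-42", 19.99)
orders.place_order("o-2", "c-42", 5.00)
print(round(customers.totals["c-42"], 2))   # 24.99
```

Notice that neither service ever touches the other's store; in a real deployment the bus would be a broker such as RabbitMQ or Kafka and the handlers would run in separate processes.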


Let’s visualize that we are building a storefront e-commerce application. In a monolithic design this would be a single application comprising Customer/Order Management, Catalog and Shipping Information. With microservices, the application is functionally decomposed: the Customer, Order Management and Catalog components are packaged as separate deployable artifacts (for example, separate WAR files), and each has its own data store, which can be relational, NoSQL, a flat file, and so on. Each component registers with a service registry and can scale independently. These components need to talk to each other, which is quite common and is achieved through pre-defined APIs: a REST API for synchronous calls or publish/subscribe for asynchronous communication. In our case the Order component discovers the Customer and Catalog services and talks to them over their REST APIs. The client interaction for the application is handled by a separate application, which discovers the services from the registry and composes them together; it should mostly be a dumb proxy that invokes the UI pages of the different components to render the interface.
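The discovery-and-composition flow above can be sketched as follows. This is a toy Python sketch with hypothetical names: plain method calls stand in for the REST requests, and the registry dictionary stands in for a real discovery service such as Eureka or Consul.

```python
class ServiceRegistry:
    """Toy registry mapping a service name to a running instance."""
    def __init__(self):
        self._services = {}

    def register(self, name, instance):
        self._services[name] = instance

    def discover(self, name):
        return self._services[name]

class CatalogService:
    def __init__(self):
        self._items = {"sku-1": {"name": "Mug", "price": 7.50}}
    def get_item(self, sku):          # stands in for GET /items/{sku}
        return self._items[sku]

class CustomerService:
    def __init__(self):
        self._customers = {"c-42": {"name": "Asha"}}
    def get_customer(self, cid):      # stands in for GET /customers/{cid}
        return self._customers[cid]

class OrderService:
    """Discovers its collaborators at call time instead of linking to them."""
    def __init__(self, registry):
        self.registry = registry

    def create_order(self, cid, sku):
        customer = self.registry.discover("customer").get_customer(cid)
        item = self.registry.discover("catalog").get_item(sku)
        return {"customer": customer["name"], "item": item["name"],
                "total": item["price"]}

registry = ServiceRegistry()
registry.register("customer", CustomerService())
registry.register("catalog", CatalogService())
order = OrderService(registry).create_order("c-42", "sku-1")
print(order["customer"], order["total"])   # Asha 7.5
```

The point of the indirection is that the Order component never hard-codes where Customer or Catalog live, so each can be redeployed or scaled without touching the others.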


The Benefits

  • Each microservice is relatively small, which makes it easier to develop and maintain
  • The pattern leads to truly scalable systems: each service can be scaled independently of the others
  • Each service can be deployed easily and independently
  • Fault diagnosis and isolation are easier; for example, a memory leak in one service affects only that service while the others continue to function normally
  • There is no long-term commitment to a particular technology stack


The Drawbacks

  • Developers must deal with the additional complexity of creating a distributed system
  • Developers must implement the inter-service communication mechanism
  • A mature DevOps team is needed to maintain a microservices-based application
  • Implementing use cases that span multiple services without using distributed transactions is difficult
  • Testing such an application is definitely harder than testing a monolith
  • Deployment complexity: in production there is the operational overhead of deploying and managing a system comprised of many different service types
  • Increased resource consumption: the initial investment to run these applications is high

The Conclusion

The microservice architectural style is an approach to developing a single application as a suite of small services, each running in its own process and communicating through lightweight mechanisms, often an HTTP resource API. These services are built around business capabilities and are independently deployable by fully automated deployment machinery. There is a bare minimum of centralized management of these services, which may be written in different programming languages and use different data storage technologies.

The microservice architecture has a number of advantages. For example, individual services are easier to understand and can be developed and deployed independently of other services. It is also a lot easier to use new languages and frameworks, because you can try out new technologies one service at a time. However, a microservice architecture comes with its own baggage and is not recommended for all projects. In particular, applications are much more complex and have many more moving parts. You need a high level of automation, such as a PaaS, to use microservices effectively, and you also have to deal with some complex distributed data management issues. Despite the drawbacks, a microservice architecture makes sense for large, complex applications that are evolving rapidly, especially SaaS-style applications.

Understanding the security mechanism of Apple’s iOS

Problem Statement:

In today’s world, many of us use iOS devices, and we are well aware of how the system has evolved over its releases. What many iOS users are not aware of is how much effort Apple has put in to make their data secure and accessible at the same time. So in this post we are going to look at a few key features of iOS that users rely on daily, and Apple’s strategy for safeguarding them.

The Prerequisites

Readers of this post should be acquainted with the key terms of Apple’s ecosystem that we are going to use, such as Touch ID and passcode. Readers who already own an iOS device should have no trouble, as they are already familiar with these terms; basic knowledge of encryption will be an added bonus.

Touch ID: Touch ID is Apple’s way of unlocking the device using biometric technology, invented so that the user does not have to go through the pain of entering a 4-digit passcode (the phone’s security lock code). The user simply places a thumb on the iPhone’s home button and the device unlocks. Touch ID is available from the iPhone 5S onwards.

You need to add your fingerprint to the iOS device to use Touch ID. If you are worried that Apple may store it somewhere in its cloud, you can relax: Apple does nothing of the sort.

The iOS device does not save your fingerprint itself. Instead, it stores a mathematical representation of it on the device’s chip, inside an advanced security architecture called the Secure Enclave (also known as the Trust Zone), the same mechanism used to protect your passcode. The Secure Enclave ensures that your fingerprint representation is guarded by world-class, secure algorithms.

The fingerprint data is encrypted using a key only known to the secure enclave and is used only by the secure enclave to verify your fingerprint.

When you touch the home button, the Touch ID sensor activates and takes a high-definition snap of your fingerprint. This data is transferred to the chip over a peripheral interface bus and forwarded to the Secure Enclave for decryption. If the unique characteristics of the snap match the representation already stored on your device, you are authenticated and the device unlocks. Once processing and analysis are complete, the fingerprint snap is discarded; it is never stored in any Apple service.

You can even use Touch ID to make purchases in Apple’s various digital media stores, and developers can use Touch ID in their applications through the LocalAuthentication framework.

Passcode: In general, passcodes are lock-screen passwords that act as security gatekeepers for your iOS device. A passcode is a 4-digit code that unlocks the device, set by the user from the device’s Settings screen.

The advantage of having a passcode is that when an unauthorized person tries to guess your 4-digit code by entering random attempts several times, the device recognizes that incorrect codes are being tapped in and inserts progressively longer pauses between attempts.

Example: if someone tries to access my passcode-protected iOS device, they will enter one 4-digit guess after another. The device will recognize that a wrong passcode has been entered and, after several attempts, will make them wait a minute before the next try, increasing this interval every time another wrong passcode is entered.

The total limit is 10 incorrect attempts; after that, iOS locks the device for good or, depending on your security settings, wipes the data.
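The escalating-pause behaviour can be sketched as a small function. The specific schedule below is illustrative only, not Apple's actual (unpublished) thresholds:

```python
def lockout_delay(failed_attempts, max_attempts=10):
    """Return the wait (in seconds) before the next passcode attempt,
    or None once the attempt limit is reached (device locks/wipes).
    The schedule is a made-up illustration of escalating pauses."""
    if failed_attempts >= max_attempts:
        return None                      # locked for good / data wiped
    # first few attempts are free; later ones pay an increasing penalty
    schedule = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
    return schedule.get(failed_attempts, 0)

print(lockout_delay(3))    # 0    - early attempts are not throttled
print(lockout_delay(5))    # 60   - one-minute pause after five failures
print(lockout_delay(10))   # None - attempt limit reached
```

The effect is that random guessing becomes impractically slow long before the hard limit is hit.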

When you turn on a passcode, you also turn on another layer of security called data protection, which creates a new encryption key used to encode certain files the OS marks as critical, such as the keychain (explained below).

Any encryption mechanism needs a key to encrypt your data. In iOS, the user-entered passcode is that key, and it is never stored in the Apple chipset or the Secure Enclave; this way even Apple does not know your passcode, as it is not physically stored on the device.

Since the passcode is not stored on the device, the only way your data can be hacked (if the device ever falls into the wrong hands) is a brute-force attack: the hacker tries every 4-digit combination until finding the right one. In the coming iOS 9 release, Apple will ask users to set a 6-digit passcode rather than a 4-digit one, making it much more difficult to crack: a 4-digit code has 10,000 possible combinations, while a 6-digit code has 1,000,000.
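The difference is easy to quantify. A rough sketch, assuming a hypothetical ~80 ms of on-device key-derivation work per attempt (the attempts are hardware-bound, so they cannot be sped up off-device):

```python
# Worst-case brute-force effort for numeric passcodes of a given length.
def combinations(digits):
    return 10 ** digits

SECONDS_PER_ATTEMPT = 0.08   # assumed figure for illustration only

for n in (4, 6):
    total = combinations(n)
    hours = total * SECONDS_PER_ATTEMPT / 3600
    print(f"{n}-digit: {total:,} codes, ~{hours:.1f} h to try them all")
# 4-digit: 10,000 codes, ~0.2 h to try them all
# 6-digit: 1,000,000 codes, ~22.2 h to try them all
```

And this is before the escalating lockout pauses described above, which make exhaustive guessing on the device itself far slower still.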

Keychain data protection: The iOS keychain provides a secure way to store a user’s sensitive data and your application’s private keys. Hence, from a development standpoint, any sensitive data should be stored in the iOS keychain rather than in a plist (property list) or NSUserDefaults.

Keychain items can also be used to share data between two applications, provided both applications come from the same developer. iOS stores keychain data using AES-256 encryption.

Apple provides developers its own keychainWrapper class, which helps us store sensitive data in the iOS keychain during development.

File data protection: Every file in iOS is encrypted using the data protection mechanism. When a file is created, data protection generates a 256-bit key (the per-file key) and gives it to the device’s AES engine, which uses the key to encrypt the file.

The per-file key is wrapped with one of the class keys, which determine under what circumstances the file may be accessed; once wrapped, the per-file key is stored in the file’s metadata. The file system’s metadata is itself encrypted with a random key, created when iOS is first installed or when the user wipes the device; this can be termed the file-system key.

When the system needs to open a file, the file’s metadata is decrypted using the file-system key, revealing the wrapped per-file key. The per-file key is then unwrapped using the class key that originally wrapped it and sent to the hardware AES engine, which finally decrypts the file.
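The key hierarchy described above can be sketched as follows. This is purely illustrative: the XOR-based "wrap" stands in for the AES key wrapping iOS actually uses, and only the structure (file-system key protects metadata, class key protects the per-file key) is meant to carry over.

```python
import hashlib
import os

def toy_wrap(key, wrapping_key):
    """Illustrative key wrap: XOR with a keystream derived from the
    wrapping key. Real iOS uses AES key wrapping; this only shows the
    shape of the hierarchy, not a secure construction."""
    stream = hashlib.sha256(wrapping_key).digest()
    return bytes(a ^ b for a, b in zip(key, stream))

toy_unwrap = toy_wrap   # an XOR wrap is its own inverse

# The three layers of keys, as described in the text:
file_system_key = os.urandom(32)   # created at install / device wipe
class_key       = os.urandom(32)   # governs when the file may be accessed
per_file_key    = os.urandom(32)   # unique 256-bit key for this one file

# Writing a file: wrap the per-file key with the class key and store the
# result in the file's metadata (the metadata itself would in turn be
# encrypted under the file-system key; elided here).
metadata = {"wrapped_key": toy_wrap(per_file_key, class_key)}

# Reading the file: decrypt the metadata, then unwrap the per-file key
# with the same class key and hand it to the AES engine.
recovered = toy_unwrap(metadata["wrapped_key"], class_key)
print("per-file key recovered:", recovered == per_file_key)   # True
```

The layering is what makes features like remote wipe cheap: destroying the file-system key or a class key makes every key wrapped beneath it, and hence every file, unrecoverable at once.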


The above is just a summary of how Apple plans to keep user data safe; the iOS system is smart enough to start its encryption checks the moment the OS boots and verify that everything is secure. We hope this post has helped you understand a few security concepts of the popular iOS system.

Fulcrum looks forward to demonstrating its capabilities at the UCISA CISG 2014 conference

What do colleges and universities need to become truly accessible to students? How can they resolve the issue of isolated, stand-alone academic information systems? Why are institutions now keen on bringing academic, administrative, and social systems onto a single platform? These and many more questions concern the IT heads of Higher Education Institutions (HEIs), which have lately realized the importance of creating a collaborative environment for students and staff alike.

It's no secret that the outdated student records systems are preventing HEIs from putting students at the heart of their system. To help universities escape this predicament, Fulcrum will join other Thought Leaders at the UCISA CISG Conference 2014 in Manchester from 12-14 November.

During this 3-day conference, team Fulcrum will demonstrate its problem-solving capabilities and the challenges it has overcome to improve universities' information delivery systems that support students throughout their academic journey.

While some institutions need a complete remodelling of their information architecture, others want a long-term IT strategy in place. From prospect to alumnus, there is a greater need than ever to provide students with excellent service. As students and university staff expect a robust IT-enabled environment even beyond their institution, Fulcrum continues to deliver out-of-the-box solutions that help HEIs remove the barriers to operating a multi-site campus.

In the last few years, universities' IT expenditure has skyrocketed due to the existence of too many applications and vendors. Multiple sources of data fail to deliver the learning experience that universities want to provide. The UCISA CISG 2014 Conference is going to be an ideal platform for HEIs that are not yet in sync with the concept of a digital campus.

By virtue of having partnered with leading HEIs, team Fulcrum has gained experience in making learning technologies more accessible to students and widening their participation. Fulcrum has also helped HEIs realize some of the strategic benefits of cloud computing. Its homegrown portal framework is one of the earliest cloud-based initiatives on Office 365 for the HE sector. This new-age portal has demonstrated the ability to reach a wider network of students and deliver information on demand.

Fulcrum's attendance at the CISG 2014 conference is aimed at making colleges and universities aware of the need to improve quality and timeliness of data provided to the students and other stakeholders. We have helped HEIs reduce the data burden by creating Interconnected and Interoperable information management systems. This time too, we are ready to set strategic directions for them.

As a visitor, you have lots to learn:

  • What are the next big things in IT for the HE sector?
  • How can IT governance be structured to ensure swift and effective decision making?
  • How do you define an IT strategy that can shape the future of HEIs?
  • How can cloud adoption help HEIs deliver a better student experience?

We look forward to seeing you all at the CISG Conference.

To know more about the event, visit: UCISA CISG 2014


We are making resounding progress in 2014 on all fronts, and a flurry of global recognitions is really establishing us as a name to reckon with. The most recent feat: Fulcrum has won the Fast 50 Asian American Business Awards. This award is given to a select few who have "contributed to the vitality of the global economy." According to the US Pan Asian American Chamber of Commerce, Fulcrum has "demonstrated resilience and performance excellence despite the challenges in the US and world economy."

It's a great honor to be in the league of Fortune 500 organizations that have achieved this milestone. More excitement is on the cards, as Fulcrum will be officially honored at a glittering ceremony on June 2.

Have a glimpse at some of the achievements that recently came Fulcrum's way:

  • The Smart CEO team has chosen Dhana as one of the forward-thinking management leaders in New York, and he will be felicitated with the Executive Management Award on 30 April.
  • Fulcrum was shortlisted for the prestigious KnowList Leadership Award, one of the most valued industry recognitions in the UK.
  • Fulcrum was shortlisted for the Company Showcase session at NASSCOM Product Conclave. Our team leveraged this important platform and presented Fulcrum before the industry stalwarts.
  • Rajesh was invited as one of the key speakers for the New York SmartCEO's Round Table Discussion on Emerging Markets. His views were highly applauded by the other panel members and the august audience, which did a lot of good for Fulcrum's visibility.
  • Suvarna was recently invited as a Celebrity Guest for the TimesJobs High Tea Session. You can read the chat transcripts here. TechGig, a popular online news portal, featured an exhaustive article on Suvarna's session, which has already received about 1,300 views.
  • Rajesh's session with the Engineering students in Mumbai in March has been a huge success. That session further augmented Fulcrum's employer branding initiative.
  • Leading Industry Body NASSCOM recently collaborated with Fulcrum and invited Rajesh to conduct Mentorship Session for Product Entrepreneurs. The session saw an impressive turnout and an overwhelming response.


The insurance sector is investing in technology enablement with the objective of propelling significant innovation. This trend can be leveraged only when IT leaders take into account the hot-button issues pertaining to the sector: fast-changing demographics, shifting buyer values, and increased industry regulation. The sector is primarily counting on IT partners to increase the level of engagement with customers and empower them.

Enterprise Mobility Gaining Ground

The sector is closely monitoring the growing influence of mobile technology and the trend of converting Big Data into actionable insights. The CIOs of insurance providers are almost unanimous in their approach to mobility: it is certainly the key frontline for business innovation. The rapid surge in consumer mobile apps has been transforming the way insurers conduct business, and more and more firms are looking to provide mobile capabilities for policyholders and agents.

The industry's IT roadmap is not just restricted to downloadable apps for employees, agents, and customers. It has a definite strategy to adopt mobility solutions to meet important business goals such as effective claims processing, 'on-the-move' connectivity with customers, and empowerment of field marketers. Enterprise mobility will soon become even more established in the insurance sector as comfort with Bring Your Own Device (BYOD) and mobile security solutions grows.

Deriving Valuable and Actionable Insights out of Big Data

Insurance is predominantly a data-driven industry, and its data is growing at a rapid pace. What poses a tough challenge for the CIOs of insurance firms is the volume of unstructured data; this is precisely why they lack true insight into customers, which in turn affects enterprise risk management. The sector faces a two-fold problem: first, it has yet to go through the technology modernization needed to support Big Data, and second, it derives no valuable, actionable insights despite having the data available.

Lately, insurance providers have become heavily dependent on third-party data to verify claimant information and assess fraud risk, which has added to the already existing pile of data. IT players have to come to the rescue of an industry that continues to dwell on two crucial questions: how to aggregate huge volumes of data, and how to analyze that data to make intelligent business decisions.

Automating Regulatory Compliance

Adapting to industry regulations and aligning them with business processes is a recurring challenge. Implementing compliance practices consistently requires flexibility in technology systems, which is presently not the case. In such a business scenario, automating regulatory compliance is the swiftest way of assuring workflow modification as and when new regulations are enacted. Insurance players want to position themselves ahead of their rivals when it comes to addressing new and emerging mandates; it's a vital element of their strategy to win customers' confidence.

The recent trends show that insurers are fast making a switch from their legacy environments to automate regulatory compliance. The idea is to get instant access to information and eliminate the risk of non-compliance. They want to be change-ready and at the same time be fair to policyholders.

Focus on Improving Agent Experience

Since agents bring in the largest chunk of business for insurers, the latter are now more focused on improving agents' experience. Survey reports reveal that the 'Agent Portal' has recently caught their imagination: they are eyeing an easy-to-access portal for field marketers that would empower them with transactional capabilities. To build a robust portal environment, insurance companies need a comprehensive IT roadmap.

However, improving agent experience is about more than just providing real-time connectivity. The IT enablers have to work as strategic partners with insurers who want to know, 'How can we enable agents to be more productive and generate more business?' and 'How can we reduce the time agents spend on non-revenue-generating activities?' This is where technology leaders have to deliver: they must churn out solutions with ease of use and ease of deployment in mind, for these will determine the outcome.