
Techaisle Analyst Insights

Trusted research and strategic insight decoding SMBs, the Midmarket, and the Partner Ecosystem.
Anurag Agrawal

Big Data in the Cloud - an ideal solution for SMB banks

The Wall Street Journal carried an article on how regulatory burdens have made community banks "too small to succeed," despite these banks being better capitalized and having lower default rates than larger banks.

The advent of cloud technologies has the potential to change WSJ’s dire prognosis.

Cloud may have first been introduced as a means of reducing CAPEX and/or overall IT costs, but today, it is viewed by small and midmarket businesses as a means of increasing business agility and of introducing capabilities that would have been cost or time-prohibitive to deploy on traditional technology. Complementary to cloud, big data analytics presents the possibilities of connecting together a variety of data sets from disconnected sources to produce business insights whether for increasing sales, improving products or detecting fraud. SMB banks are a specific segment of SMBs who can derive the benefits of customer insight while meeting their mandatory regulatory requirements.

Techaisle classifies SMB banks as those with less than $10B in assets and medium-sized banks as those with $10-100B in assets. SMB banks below $10B in assets, often called "community banks," play a very important role in the ecosystem of SMB businesses. Although the FDIC, OCC and FRB define community banks differently, it is important to note that these smaller banks not only accounted for nearly half of the roughly $600B in outstanding small business loans at the end of 2014, but also play a disproportionately large role in the $1.8 trillion residential mortgage origination market.

Unlike large banks, SMB banks are personified by George Bailey in "It's a Wonderful Life." These banks usually have keen insight into their customers based on personal relationships, and they carry a tremendous amount of tribal knowledge about those customers which they use to make business decisions. While this corpus of knowledge may not be codified, it does make a difference in their business operations. But is that enough in today's hyper-competitive economy, where the relationship is increasingly controlled and dictated by customers?

Then there is another question: are these smaller banks doing enough to detect fraud? High-risk businesses that have been denied services by large banks tend to move their business to smaller banks, which are less equipped to analyze these risks. These smaller banks are unknowingly exposing themselves to fraud as well as compliance risk. Regulations are agnostic to bank size and are as unforgiving of SMB banks as they are of large banks. A cloud-based analytics solution may be just the recipe for success for the smaller banks. In fact, these banks are no different from midmarket businesses (or even small businesses) in their objectives for adopting big data.

[Figure: Top business drivers for SMB big data adoption]

Monitoring, analyzing and reporting very large volumes of data are typically the largest components of regulatory costs for SMB banks. Many often use antiquated technology and manual processes to manage their compliance requirements. Banks that are able to automate the process of managing data for regulatory requirements can have the added benefit of getting a unique view of their customers through one single technology solution.

According to Shirish Netke, CEO of Amberoon, a provider of big data solutions for banks, "A lot of the data that is required for regulatory compliance can also be easily parlayed into getting insights on the bank's customers and improving business." Amberoon has built a banking solution for SMB banks provisioned on the IBM SoftLayer cloud.

Security and privacy (especially FFIEC requirements), traditional inhibitors of cloud adoption, are a legitimate concern for banks. After all, banks are the custodians of individuals' money, facilitators of trade and commerce, and the lifeline of businesses. However, it may be argued that these inhibitors have already been successfully addressed by service bureaus. A very large percentage of SMB banks outsource their core banking systems to service providers such as Fiserv and FIS Global, which have built very large, scalable service bureaus with the economies of scale afforded by centralizing technology resources.

As Noor Menai, CEO of CTBC Bank, aptly put it: "Outsourced technology services are nothing new in the banking industry. There is a compelling reason to use big data technologies in banks if they are available at an affordable cost in a secure manner. Cloud has the potential to provide both."

Big data analytics in the cloud can be an execution advantage, and may even propel SMB banks to leap ahead of larger banks with solutions that address regulatory necessities while gaining a competitive edge from customer analytics. Historically, Siebel, an on-premise solution, was usually deployed in large enterprises and was out of reach for smaller businesses. Salesforce, a cloud solution, changed the perception, adoption, usage and affordability, and provided immediate business outcomes. Today Salesforce is used by SMBs as well as large enterprises.

Combining the benefits of cloud with the advantages of big data analytics may just be the prescription that SMB banks need for business growth (cross-selling, upselling services), meeting regulatory requirements such as KYC/AML/BSA and deep-diving into fraud detection.

One should also not forget that big data implementations require a unique combination of technical, operational and business skills, applied in a sustained manner. Needless to say, these skills are in short supply and affordable mainly for deep-pocketed larger banks. While some smaller banks, including community banks, can spend the money to experiment with big data pilots, they do not have the capacity to go through expensive iterations to get it right. While larger banks have the luxury of choosing between on-premise and cloud big data, for smaller banks the choice could very well be between doing big data in the cloud or not doing it at all. The remaining question, therefore, is: which big data cloud supplier will take the lead in educating, evangelizing and then executing on the needs of SMB banks?

Dr. Cooram Ramacharlu Sridhar

What is the big deal with ANN?

In the thirty years since Shunu Sen posed the marketing-mix problems, I have been busy with marketing research. I tried modeling most of the studies and discovered that market research data alone is not amenable to statistical predictive modeling. Take, for example, imagery. Is there a correlation between image parameters and purchase intention scores? There should be. But rarely does one get a correlation coefficient above 0.35. Try to link awareness, imagery, intention to buy, product knowledge, brand equity, etc. to the performance of the brand in the marketplace, and one discovers land mines, unanswered questions and inactionability.

This is where ANN steps in.

Technically, ANNs (Artificial Neural Networks) offer a number of advantages that statistical models do not. I will list a few of them.

    1. Non-linear models are a tremendous advantage to a modeler. The real world is non-linear, and any linear model is a huge approximation.

    2. In a statistical model, the model gives the error and one can do precious little to decrease it. With an ANN one can specify the error tolerance. For example, we can fit a model at 85, 90, 95 or 99% error tolerance. It requires some expertise to figure out whether there is an overfit and what is the optimum error one can accept.

    3. Statistical models make assumptions about distributions that do not hold in the real world. ANNs make no distribution assumptions.

    4. Most ANN software available today does not identify the functions that are fitted. We, on the other hand, have been able to identify the fitted functions, extract the weights and build them into an algorithm.
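Two of the points above, non-linear fitting and a modeller-chosen error tolerance, can be sketched with a tiny network. This is a minimal illustration in plain Python, not the software described in the text: a one-hidden-layer tanh network trained by per-sample gradient descent until the mean squared error falls below a tolerance the modeller specifies up front. All parameter values (learning rate, hidden size, tolerance) are assumptions for the sketch.

```python
import math
import random

def train_ann(data, hidden=8, tol=0.005, lr=0.05, max_epochs=10000):
    """Train a 1-input, 1-output network with one tanh hidden layer by
    per-sample gradient descent, stopping once the mean squared error
    drops below `tol` -- the error tolerance chosen up front."""
    random.seed(0)
    w1 = [random.uniform(-1.0, 1.0) for _ in range(hidden)]  # input -> hidden
    b1 = [0.0] * hidden
    w2 = [random.uniform(-1.0, 1.0) for _ in range(hidden)]  # hidden -> output
    b2 = 0.0
    mse = float("inf")
    for _ in range(max_epochs):
        sq_err = 0.0
        for x, y in data:
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            out = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = out - y
            sq_err += err * err
            for j in range(hidden):
                grad_h = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err
        mse = sq_err / len(data)
        if mse < tol:  # the modeller's chosen error tolerance
            break
    return mse

# A deliberately non-linear relationship (y = x^2) that a straight line
# cannot capture but a small ANN fits easily.
data = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]
final_mse = train_ann(data)
```

The same loop with a tighter or looser `tol` gives the 85/90/95/99% trade-off described above: the modeller, not the model, decides how much error to accept.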



How do we bring the differentiation?

Our biggest strength is data integration that combines market research and economic data with transaction data into a single file. This is tricky and requires some ingenuity. We use Monte Carlo techniques to build these files and then use ANN to build the simulation models. Optimization then becomes clear and straightforward, since we do not use statistical models. Optimization using statistical modeling, which most modelers use, is a nightmare. Most large IT vendors and even analytics companies continue to use statistical modeling for optimization, and therein lies the problem. Nor are these companies aware of the possibilities that ANN can provide. Most modeling is done using aggregate data, whereas we handle the data at the respondent level. Conventional modeling is macro-data oriented, whereas we are micro-data oriented. Hence the possibilities we can generate with micro data for modeling are huge compared to macro data.
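The respondent-level fusion step can be sketched as follows. This is a hypothetical illustration, not the actual Techaisle pipeline: the segment names, shares, intent scores and spend distributions are invented, and the idea shown is simply Monte Carlo sampling that stitches a survey attribute and a simulated transaction value into one synthetic respondent-level file.

```python
import random

random.seed(42)

# Hypothetical inputs: survey research gives segment shares and a
# purchase-intent score; transaction logs give per-segment spend
# distributions (mean, std). All numbers are invented for the sketch.
survey = {
    "loyal":    {"share": 0.3, "intent": 8.1},
    "switcher": {"share": 0.5, "intent": 5.4},
    "lapsed":   {"share": 0.2, "intent": 2.7},
}
spend = {"loyal": (120.0, 15.0), "switcher": (60.0, 25.0), "lapsed": (15.0, 8.0)}

def fuse(n_respondents=10000):
    """Monte Carlo fusion: draw a synthetic respondent-level file in which
    every record carries both a survey attribute (intent) and a simulated
    transaction value, so a single file can feed the simulation model."""
    segments = list(survey)
    weights = [survey[s]["share"] for s in segments]
    records = []
    for _ in range(n_respondents):
        seg = random.choices(segments, weights)[0]
        mu, sigma = spend[seg]
        records.append({
            "segment": seg,
            "intent": survey[seg]["intent"],
            "spend": max(0.0, random.gauss(mu, sigma)),  # no negative spend
        })
    return records

fused_file = fuse()
avg_spend = sum(r["spend"] for r in fused_file) / len(fused_file)
```

Because every record is an individual respondent rather than an aggregate cell, downstream models can be run at the micro level, which is the distinction the paragraph above draws between micro- and macro-data modeling.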

We have crossed the stage of theories. There are many projects that we have executed successfully that have gone on to become a must-have analytical marketing input mechanism.

Doc
Techaisle

Dr. Cooram Ramacharlu Sridhar

Predictive Analytics – The Predicament

Predictive Analytics (PA) is emerging as an important tool in the area of business decisions. Predictive analytics primarily deals with making a forecast based on several inputs. In this blog and those that follow, I will share my experiences with Predictive Modelling (PM), with a view to contributing to the current knowledge base in the predictive analytics world.

In the world of business, most predictive analytical tools are quantitative: numeric data is used to build an input-output model, and the output is the prediction for specific inputs. For example, "a 10% increase in advertising in January will result in a 1% increase in sales in May" is a typical output from predictive analytics.

Common Mistake of Predictive Modelers: Assumption of linearity

Predictive models are largely based on statistical techniques. The Multiple Linear Regression (MLR) model is what most users will confront when they look at predictive models. This model works in the background whether one is using multiple time series or multi-level modelling.

Multiple Linear Regression models are built on a crucial assumption: the output is linearly dependent on the inputs. But experience shows that in most business situations the assumption of linearity is not valid. Hence the statistical models have a poor fit and low predictive capability. In addition, the business world also suffers from Black Swan problems that no modelling can manage with any level of confidence.

The net effect of the linearity assumption, which is ubiquitous in almost all statistical modelling, and the resultant poor fit and low predictive capability, has been a frustrated user community. Hence, business executives look at models with suspicion and trust their "gut" to make decisions.
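The failure mode described above can be made concrete with a deliberately extreme, stylized example: an inverted-U response in which sales rise with advertising and then fall as the campaign saturates. The relationship is strong, but the best straight line through it explains essentially none of the variance. The data and the response shape are invented for the illustration.

```python
def ols(xs, ys):
    """Ordinary least squares fit y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def r_squared(xs, ys, a, b):
    """Share of variance explained by the linear fit."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Stylized inverted-U response: sales rise with ad spend, then fall as
# the campaign saturates and wears out.
xs = list(range(21))             # e.g. weekly ad-spend index
ys = [x * (20 - x) for x in xs]  # response peaks at x = 10

a, b = ols(xs, ys)
r2 = r_squared(xs, ys, a, b)
```

By symmetry the fitted slope is zero and R-squared is zero: MLR reports that advertising has no effect, even though the effect is large and perfectly deterministic. Real business data is rarely this extreme, but the direction of the error is the same.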

Predicament

The Predictive Modellers' predicament is this: how do we get away from the linearity assumption on which almost all statistical tools are based, when that assumption is known to be a poor, in fact a very poor, approximation of real-world behaviour?

The story of our approach to modelling starts from this predicament, which we shared with everyone else, and the path we cut to get out of it.

Dr. Cooram Ramacharlu Sridhar (Doc)
Managing Director and Advisor, Segmentation & Predictive Modeling

Dr. Cooram Ramacharlu Sridhar

Predictive Modelling – Watch out for land mines

Before you attempt any modelling, you should first look at the inputs and outputs that you want to go into your model. Here is the matrix:

[Figure: Measurability-controllability matrix of input variables]


What you need to do is make a laundry list of the variables (inputs) that affect the output. Typically, in a marketing company, one would look at sales as the output and a whole range of variables as inputs. Let me look at a few examples for these cells.

1. Measurable-Controllable

GRPs of your brand through TV advertising are measurable and controllable.

2. Measurable-Not-Controllable

Inflation is measurable but not controllable.

3. Not-Measurable-Not-Controllable

The amount your competition invests in dealer incentives is neither easy to measure accurately nor within your control, yet it impacts the sales of your brand.

4. Not-Measurable-Controllable

Not measurable generally refers to qualitative factors that are often measured by a pseudo variable, for example, the quality of your salespeople.

In your business environment, if the majority of your input variables are in Cells 1 and 2, and you feel that these make a big impact, then modelling will be successful. If not, and many variables are in Cells 3 and 4, modelling will not succeed.
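The preliminary exercise above is simple enough to express in a few lines. This sketch uses an invented laundry list (the variable names and tags are examples, not data): each input is classified into a cell of the matrix, and the cell counts drive a rough go/no-go verdict following the majority rule described above.

```python
# Hypothetical laundry list: each candidate input tagged by whether it is
# measurable and whether it is controllable. Names and tags are examples only.
variables = {
    "TV GRPs":                      {"measurable": True,  "controllable": True},
    "price promotion depth":        {"measurable": True,  "controllable": True},
    "inflation":                    {"measurable": True,  "controllable": False},
    "competitor dealer incentives": {"measurable": False, "controllable": False},
    "salesperson quality":          {"measurable": False, "controllable": True},
}

def cell(v):
    """Map a variable to its cell in the measurability-controllability matrix."""
    if v["measurable"] and v["controllable"]:
        return 1  # Measurable-Controllable
    if v["measurable"]:
        return 2  # Measurable-Not-Controllable
    if not v["controllable"]:
        return 3  # Not-Measurable-Not-Controllable
    return 4      # Not-Measurable-Controllable

def modelling_outlook(variables):
    """Modelling is promising when Cells 1 and 2 hold the majority of inputs."""
    counts = {1: 0, 2: 0, 3: 0, 4: 0}
    for v in variables.values():
        counts[cell(v)] += 1
    verdict = ("promising" if counts[1] + counts[2] > counts[3] + counts[4]
               else "risky")
    return counts, verdict

counts, verdict = modelling_outlook(variables)
```

Running the classification before any modelling starts is exactly the point of the matrix: the verdict costs minutes, while discovering mid-project that Cells 3 and 4 dominate costs the whole engagement.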

Most companies do not undertake this simple preliminary exercise of classifying the variables that impact their business, and then hit potholes throughout design, testing and implementation.

Unclassified variables are veritable landmines. Watch out for them.

Dr. Cooram Ramacharlu Sridhar (Doc)
Techaisle
