InvestCloud Community Service Day with Food on Foot in Hollywood


InvestCloud’s team members and their families gathered for a good cause on Sunday, October 25, 2015. InvestCloud sponsored the entire day, organized by Food on Foot, and we helped by distributing pre-packed food and clothing to the homeless and less fortunate.


InvestCloud, Inc. Appoints John Stuart as Chief Marketing Officer and EVP of Hybrid Solutions

“We are very excited that John has joined InvestCloud. With his leadership, we are in an even stronger position to help empower our clients to enter their next phase of growth, whether they operate locally, nationally, or internationally,” said John Wise, CEO of InvestCloud, Inc. “As a seasoned professional with extensive knowledge of both technology and business operations in the Investment Advisory and Broker Dealer spaces, John brings invaluable talent and diversity of experience.”


Leverage – The Art of Software Generation

The days of relatively simple requirements and $100,000, six-month changes are behind us, although for many companies these types of issues are still very much mainstream. With years of industry knowledge under our belts, along with an understanding of the industry’s mainstream challenges and shortcomings, InvestCloud took a different, “leverage-centric” approach.

Read more from InvestCloud’s CEO, John Wise at http://investcloudsolutions.blogspot.com/2013/04/investcloud-white-paper-leverage-art-of.html?q=data+warehouse

John Wise is a founding partner and CEO of InvestCloud, and is a serial entrepreneur. In his role, John is focused on product design and promotion of the world’s first investment management platform in the cloud. John was the founder of the most successful data warehouse company serving financial asset services firms to date.

John Wise introduces Cloudonomics for Software (part 2 of 4)

The increasing number of vendors and varying types of firms providing a “software” cloud is the most significant paradigm shift in technology happening today. Simply, consumers and businesses are not only becoming more comfortable with, but also beginning to prefer, cloud-based software platforms where there is no hardware visibility from the user’s or CTO’s perspective.
 
Reduced Cost of Updates
Cloudonomics is optimized if the applications (software) hosted in the cloud are truly “multi-tenanted,” meaning that an individual application is set up to service multiple customers. An immediate benefit of this approach is that a single software update will simultaneously benefit all users. Only one update is necessary, as opposed to the traditional difficulty of enterprise software that requires updates to occur on each site remotely, and in many cases doesn’t allow for local configurations. In an enterprise software environment the potential for negatively impacting employees or clients during an upgrade is greatly increased because of the varying number of computers and servers. The economic cost of this is lost time and productivity.
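As a rough sketch of what multi-tenancy means in practice, the fragment below (hypothetical names and data, not InvestCloud’s actual design) scopes every data access to one tenant inside a single shared application:

```python
# Sketch of multi-tenant data access: one application instance serves
# every customer, and each query is scoped to a single tenant.
# All names and records here are illustrative.

records = [
    {"tenant": "firm_a", "account": "A-1", "value": 100.0},
    {"tenant": "firm_a", "account": "A-2", "value": 250.0},
    {"tenant": "firm_b", "account": "B-1", "value": 75.0},
]

def accounts_for(tenant):
    """Return only the rows belonging to one tenant.

    In a multi-tenant application every data-access path applies a
    filter like this, so a single code (and schema) update serves
    all tenants at once.
    """
    return [r for r in records if r["tenant"] == tenant]

print(len(accounts_for("firm_a")))  # 2
```

Because all tenants share one application and one schema, the “single software update benefits all users” economics above falls out of this structure directly.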
 
A cloud vendor’s ability to provide simultaneous updates generates a significant cost and time benefit to the company. This is because of the elimination of the need for technical experts to implement periodic updates. Additionally, if the deliverable to the consumer relies on the updates, then the simultaneous updates will also generate a cost savings.
 
Enterprise software updates are also hazardous to work efficiency because they are often large undertakings, which forces the retraining of users and the reintegration of other applications. This can be a source of frustration for consumers who are used to a previous version of the system and do not have the time to train and adopt large changes.
 
The cloud brings with it the concept of functional adoption, whereby functions are adopted incrementally as they are used. Small function-focused software updates can be consumed more naturally than large, monolithic updates across various functions and features. Also, typically, many of the solution integration/upgrade requirements are handled by the cloud supplier, and therefore the consumer is unaware that their interfaces have been changed. A cloud company might release software weekly, whereas an enterprise software company might look to do large upgrades annually. Practically, cloud updates are smooth. For example, Google functional updates are gradual, and the user experience is positive. In sharp contrast, Microsoft’s update from XP to Vista was extremely difficult and unpopular.
Overall, software Cloudonomics results in a more cost-effective way to purchase software and provide maintenance. The result is simple and frequent adoption of changes while reducing training and interface rework requirements. The savings can range from $100,000 to millions for a large enterprise.
 
Improved User Experience
Leveraging software clouds also frees up some of the traditional design constraints associated with enterprise software and allows for the development of user experience-centric interfaces.
 
Traditionally, user experiences with enterprise solutions are designed by the programmers who built the application. Programmers are, by nature, more technical and generally accepting of “C:\”-style user interfaces (a prompt that takes instructions), a technical interface that is not led by design or the user experience.
 
Meet PADI: Presentation, Application, Database and Interfaces (P<>A<>D<>I = pure PADI). Software designers are aware of the need to divide solutions into the four PADI layers. The issue that enterprise software programmers face is that they suffer from weak discipline, and the Application logic gets spread throughout the Presentation, Database and Interface layers (i.e., PA<>A<>AD<>AI). Another deficiency is that the presentation has been wedded to the application logic, which results in difficulties when trying to change the application. The result is that enterprise software is very difficult to move to a native browser, and various slow (non-cloud) software is used to stream desktop interfaces to the customer (i.e., a cloud fake). This approach may be the only approach for some enterprise vendors.
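The PADI separation can be illustrated with a toy example; the layer names, functions and data below are invented for illustration and are not from any real product:

```python
# Minimal sketch of "pure PADI" layering (hypothetical names):
# Presentation, Application and Database kept strictly separate, so
# the presentation can be swapped (browser, mobile) without touching
# business logic.

# Database layer: raw storage access only.
def db_get_position(account):
    _table = {"ACC1": {"symbol": "XYZ", "qty": 10, "price": 4.0}}
    return _table[account]

# Application layer: business logic only, no formatting.
def app_market_value(account):
    pos = db_get_position(account)
    return pos["qty"] * pos["price"]

# Presentation layer: formatting only, no business logic.
def present_market_value(account):
    return f"Market value: {app_market_value(account):.2f}"

print(present_market_value("ACC1"))  # Market value: 40.00
```

Because the presentation layer never reaches past the application layer, a browser or mobile front end can replace `present_market_value` without any change to the layers beneath it, which is exactly the separation a native-browser move requires.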
Software Cloudonomics addresses the user experience from the point of view of usability, accessibility and flexibility. Usability requirements are functional, psychological and aesthetic. The functions need to be obvious and honest, with their value self-evident. The psychological and aesthetic needs relate to the positive experience of the solution and how intuitive and self-explanatory the solution’s functions and workflow are.
 
The technical requirement is to ensure pure PADI. If this is achieved, the presentation layer can be designed by user interface experts to achieve the expected software cloud user experience; multi-client platforms (many browsers, iPad, smartphones, etc.) are practically mandatory and force the separation.
 
Leverage Solution
Cloud software can provide substantial leverage from other cloud suppliers. This can be achieved by mashing (combining) other suppliers’ user interface functions and data from disparate sources. If done well, the user is unaware that an application might combine several mashed points.
 
More sophisticated mashing is possible through a cloud vendor deploying a repository into its cloud. The repository can be integrated in the cloud, allowing multiple vendors to provide a data storage area that is neutral to the integrated vendors. The cloud vendor provides the presentation and application logic to the repository as a database (repository/warehouse). Cloud repositories represent an extremely powerful method of providing often talked about but rarely implemented requirements such as “enter once and only once”; “all in one place”; or “primary location of access, control and entry.” A cloud repository is a simple approach to achieve these requirements and provides better access to information.
 
Overall, a software cloud consumer can benefit from a) lower cost, b) a more intuitive interface and user experience and c) better access to information than 1980s enterprise computing.
 
Level 2 Software Cloud Concerns, Objections and Barriers to Adoption 
Typically, internal technology departments drive the concerns and objections to software clouds. These objections often include:
a) Data and cloud security
b) Ease of customizing and reacting to customer needs
c) Existing investment in legacy enterprise solution
d) Asset write-downs of existing hardware and software
 
Addressing These Concerns
a) Although security will be continually debated, data in an enterprise deployment environment is less secure than when stored in a professionally managed and secure data center. In that case the primary importance is not where the applications and data reside, but rather that the correct level of user permissioning is configured, which increases overall security and offers a full audit trail of user access.
b) The ability to react to clients’ customization needs must be addressed and tested on a vendor-by-vendor basis; however, it is often the case that an agile business manages this process internally, alleviating wait time for an outside vendor. Cloud can be reactive, but this will be dependent on selecting the right vendor.
c) There are various ways to not only leverage and extract continued value out of existing infrastructure (as discussed previously) but also ways to resell equipment if it is no longer wanted onsite.
d) Due to the accelerated depreciation of technology equipment these days, this should not be a consideration: the equipment can remain on the books while the machines are leveraged for development, testing or backup, and within a very short amount of time the cost savings far exceed the consideration of the assets’ values on the company’s financials.
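The permissioning and audit-trail argument in point (a) can be sketched as follows; the user names, resources and log format are purely illustrative:

```python
# Sketch of per-user permissioning with a full audit trail: access is
# checked for every read, and every attempt (granted or denied) is
# logged. All names here are hypothetical.

permissions = {"alice": {"portfolio_a"}, "bob": set()}
audit_trail = []

def read(user, resource):
    """Enforce access control and record the attempt."""
    allowed = resource in permissions.get(user, set())
    audit_trail.append((user, resource, "granted" if allowed else "denied"))
    if not allowed:
        raise PermissionError(f"{user} may not read {resource}")
    return f"contents of {resource}"

read("alice", "portfolio_a")        # succeeds, logged as granted
try:
    read("bob", "portfolio_a")      # fails, logged as denied
except PermissionError:
    pass
```

The point is that the security question shifts from where the data resides to whether every access path passes through a check like this, leaving a complete record of who touched what.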
 
The perceived risks associated with adopting a cloud operating model need to be weighed against the likely improvements in governance and security controls a focused supplier will offer versus an in-house or part-time administration resource. 
 
The overall reduction to the firm’s bottom line may also include a reduction in IT staff. As to be expected, internal IT departments will be protective of their jobs and therefore may be prone to keeping their objections high on the agenda. However, some IT teams have the vision to look ahead, and to them and their teams, the cloud represents an opportunity to shift IT focus from server updates and maintenance tasks to innovation, managing cutting-edge technology and creating new applications for the users.
 
These points need to be individually considered against each cloud vendor while weighing the overall business advantage. In addition to the direct economic advantage, there are other important factors to distinguish when migrating to the cloud, such as opportunity advantages, increased scalability for company growth and one that is often missed by firms: core focus versus context (i.e. focus on the primary business and key differentiators versus necessary business utilities).
 
Software clouds can also be referred to as Software as a Service (SaaS). The other popular term is Platform as a Service (PaaS). PaaS provides a computing platform and a solution stack as a service; the PaaS layer lies between SaaS and IaaS. PaaS is often seen as a more custom/development platform for a solution (i.e., full life-cycle support for software solutions custom to a customer). Be aware, however, that some SaaS and PaaS suppliers are not compliant with the hardware cloud requirements, and therefore the benefits of Cloudonomics might not be there.
 
For a complete copy of the white paper, please contact Kim Wise at kwise@investcloud.com.
Visit http://www.investcloud.com to learn more.

Data warehousing 3.0

Data Warehouse 3.0 “the Interactive Data Repository”
Data Warehousing 3.0 was born of issues resurging from the early data warehouse versions, namely Data Warehousing 1.0 (the data mart) and Data Warehousing 2.0 (the schema). I am continually asked what the difference is between 1.0, 2.0 and 3.0 and why the differences are important. What follows is an illustration that may provide some understanding.

Data Warehouse 1.0 (the Data Mart)

Data marts are often referred to as a “subset” of a data warehouse; unfortunately, data marts, and in fact instances of Data Warehouse 1.0, are no more than a SQL table representation of the processing system that is providing the data, such as a trading system or accounting system. These data marts were born from the generally “poor reporting” of the originating processing engines created in the last century. The benefit of a data mart is that you can report, but you are limited to silos of data with extremely limited navigation (i.e. you couldn’t navigate from performance data to position data, to tax lots, or to allocations, as the data would be in three different data marts with different references, and different keys and foreign keys). Put simply, when creating a data mart, a developer typically takes an existing extract from a processing engine (e.g. trade history) and loads it into a SQL table verbatim, with minimal or no “warehouse” functionality.

Data Warehouse 2.0 (the Schema)

The requirement to “navigate” and “drill through” is sometimes called data mining; data mining, the need to provide analytics across data sets, has driven the difficult task of designing holistic schemas. However, schema-based data warehouses typically suffer from at least one of three issues: 1) the schema is poorly designed and the desired mining/reporting/analytics are practically impossible to achieve; 2) viewing and extracting data is prohibitively slow; or 3) loading data is too slow, which results in incomplete data sets.

There are several different approaches for loading data into warehouses, with the more popular approach being to load data through staging tables (i.e. starting with un-typed data and loading/transforming it into hierarchical groups, often called dimensions, and into facts and aggregate facts). The combination of facts and dimensions is often called a star schema – a very trendy approach 15 years ago.
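As an illustration of a star schema, the sketch below (invented securities data) models a fact table keyed into two dimension tables and rolls facts up through a dimension:

```python
# Toy star schema: a fact table of trade amounts keyed into
# dimension tables (security and date). All data is illustrative.

dim_security = {1: {"ticker": "ABC", "sector": "Tech"},
                2: {"ticker": "DEF", "sector": "Energy"}}
dim_date = {20120501: {"year": 2012, "month": 5}}

fact_trades = [
    {"security_id": 1, "date_id": 20120501, "amount": 1000.0},
    {"security_id": 2, "date_id": 20120501, "amount": 500.0},
    {"security_id": 1, "date_id": 20120501, "amount": 200.0},
]

def total_by_sector():
    """Aggregate facts through the security dimension: a typical
    star-schema rollup query."""
    totals = {}
    for f in fact_trades:
        sector = dim_security[f["security_id"]]["sector"]
        totals[sector] = totals.get(sector, 0.0) + f["amount"]
    return totals

# total_by_sector() yields one total per sector.
```

The facts sit at the center; each dimension key points out to descriptive attributes, which is what makes rollups like the one above straightforward to express.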

The task of creating an effective data warehouse in the securities industry was indeed very challenging, and, as a result, in the early 2000s there were only two securities industry oriented data warehouses available (Eagle Investment Systems and Netik LLC).  Eagle was typically deployed at large asset managers, whereas Netik was much more successful within asset services companies (e.g. Custodians, Prime Brokers, Fund Administrators, and Exchanges) where the data volumes were very large.

The key components of data warehouse 2.0 that differentiated it from data warehouse 1.0 are:

  1. Industry-specific schemas

The combination of normalized reference data with de-normalized numerical results was the practical solution the securities industry needed. De-normalized data resulted in user-friendly representations of data that were quick to access and understand, whereas the normalized approach resulted in multiple tables that were more complex to understand, but it facilitated data conformity where it was necessary (i.e. performance attribution).  Both normalized and de-normalized data were prevalent in this period.
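The normalized/de-normalized trade-off can be shown with a toy example (hypothetical reference data): the normalized form needs a join-style lookup, while the de-normalized form answers the same question from one flat row:

```python
# Illustration of normalized vs. de-normalized representations of the
# same securities data (all values invented).

# Normalized: reference data split across tables, linked by key.
securities = {1: {"ticker": "XYZ", "issuer_id": 10}}
issuers = {10: {"name": "XYZ Corp", "country": "US"}}

# De-normalized: one flat, user-friendly row per position. Quick to
# read and understand, at the cost of repeating reference values.
positions_flat = [
    {"ticker": "XYZ", "issuer": "XYZ Corp", "country": "US", "qty": 100},
]

def issuer_country_normalized(security_id):
    """Reading the normalized form requires a join-style lookup
    across two tables."""
    return issuers[securities[security_id]["issuer_id"]]["country"]

# Both forms answer the same question:
assert issuer_country_normalized(1) == positions_flat[0]["country"] == "US"
```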

  2. Flexible ETL

Extract, Transform and Load (“ETL”) is the process of collecting data: a) extracting data from source systems; b) transforming the data into a defined schema; and c) loading the data into a new physical database.
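A minimal ETL sketch, using an invented source format and target schema, might look like this:

```python
# Minimal ETL sketch matching the three steps above.
# The CSV layout and target schema are invented for illustration.

# a) Extract: raw, un-typed data pulled from a source system.
raw_extract = "ACC1,XYZ,10,4.25\nACC1,DEF,5,2.00"

def transform(raw):
    """b) Transform: parse and type the raw rows into the target schema."""
    rows = []
    for line in raw.splitlines():
        account, symbol, qty, price = line.split(",")
        rows.append({"account": account, "symbol": symbol,
                     "qty": int(qty), "price": float(price)})
    return rows

warehouse = []  # stand-in for the new physical database

def load(rows):
    """c) Load: write the typed rows into the warehouse."""
    warehouse.extend(rows)

load(transform(raw_extract))
print(len(warehouse))  # 2
```

A “flexible” ETL in the sense above is one where the transform step can be reconfigured per source without rewriting extract or load.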

  3. Atomic Data Sets

One of the key challenges for most data warehouse engineers is where you can make data atomic (a data set with 100% integrity and cross references to other data sets). An example would be the prior day’s trades adding up to the beginning-of-day positions.
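The trades-to-positions example can be expressed as a simple reconciliation check (illustrative figures):

```python
# Atomicity check: the prior day's trades must reconcile to the
# beginning-of-day positions before the data set is accepted as
# atomic. All figures are invented for illustration.

prior_positions = {"XYZ": 100}          # positions at start of prior day
trades = [("XYZ", +25), ("XYZ", -10)]   # the prior day's trades
bod_positions = {"XYZ": 115}            # reported beginning-of-day positions

def reconciles(symbol):
    """True if prior positions plus net trades equal BOD positions."""
    traded = sum(qty for sym, qty in trades if sym == symbol)
    return prior_positions[symbol] + traded == bod_positions[symbol]

assert reconciles("XYZ")  # 100 + 25 - 10 == 115, so the set is atomic
```

A warehouse that enforces checks like this at load time is what gives the cross-referenced data sets their “100% integrity” guarantee.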

The problem with Data Warehouse 2.0

Typically, views of data were created by programmers; data was extracted into Excel or various business intelligence solutions and connected without true attention to business needs. The real problem was that views needed to be programmed and were not user-friendly for business analysts or operational users. In general, Data Warehouse 2.0 achieved results that were better than straight accounting (source system) based reports, but the ideal of truly interconnected data was never achieved.

In addition, the cost of ownership was very high, and the creation of new views for users was also very expensive and slow to deploy – typically this was a frustrating process.

Data Warehouse 3.0 (The Open User)

The core driver of Data Warehouse 3.0 is the openness of the warehouse to users: the combination and interaction of many non-traditional data points (such as media and documents), and the ability to perform analytics on data in real time. The requirements for points 1, 2 and 3 (from Data Warehouse 2.0) above still exist; however, ETL has now been extended to include real-time data feeds such as exchange prices and cloud-based documents. Data Warehouse 3.0 also includes:

  4. In-memory non-SQL based data access

This is a substantial difference between Data Warehouse 2.0 (with its dreadful DBMS-specific SQL stored procedures) and Data Warehouse 3.0’s ability to render and access data in memory. The benefit is super-fast queries and data interactions. Point 7 below, “mashing,” requires substantially more “data hits” and therefore would not be possible without the achievements in speed that came with Data Warehouse 3.0. InvestCloud’s approach has again been substantively unique, with the further introduction of message-based queries and interactions. The thinking in Data Warehouse 2.0 moved the warehouse beyond an Operational Data Store (ODS) or data mart, and therefore there was always a need for a second system acting as an ODS. With Data Warehouse 3.0, however, this is no longer needed: the ODS need is met simultaneously with the need for quick access to normalized and de-normalized data.
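The in-memory access idea can be sketched as follows: data is indexed once in memory, after which the many “data hits” a mashed view needs become cheap lookups rather than database round-trips (names and data invented for illustration):

```python
# Sketch of in-memory, non-SQL data access: rather than issuing a
# stored-procedure call per query, data is rendered into in-memory
# structures once and then queried directly.

positions = [
    {"account": "ACC1", "symbol": "XYZ", "qty": 10},
    {"account": "ACC1", "symbol": "DEF", "qty": 5},
    {"account": "ACC2", "symbol": "XYZ", "qty": 7},
]

# Build an in-memory index once...
by_account = {}
for p in positions:
    by_account.setdefault(p["account"], []).append(p)

# ...then serve each "data hit" as a dictionary lookup instead of a
# round-trip to a database server.
def holdings(account):
    return by_account.get(account, [])

print([p["symbol"] for p in holdings("ACC1")])  # ['XYZ', 'DEF']
```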

  5. Big Data & Real-Time Data

The differentiator of our Big Data solution is the “aspect” of the data. Today, we store all of the aspects prescribed in Data Warehouse 2.0; however, the warehouse further links to market information (social media, news feeds, and regulatory reporting). The benefit is a massive amount of data that can be used for investment decision making: a) Information Advantage (e.g. access to data that can be difficult to gather and collate); b) Analytical Advantage (historical and pro forma models); and c) Behavioral Advantage (push notifications related to events and data events). In the past, this information could be accessed, but it was un-integrated, never in one place, and there was little or no concept of history or time-series data management for anything other than portfolio data (i.e. positions and trades).

  6. Views and Interaction (Generation and Personalization)

Big Data and real-time data engender the potential for thousands of views of this powerful and rich data set. Accordingly, a new requirement exists to provide users with the ability to personalize their views and then store and recall those customized views, for everything from simply adding or subtracting data columns to detaching views into applets (small, powerful functional components). This also includes allowing users to assemble personalized dashboards – and all of this must be achieved without one second of programming/coding. All of the views in Data Warehouse 3.0 are generated and, therefore, the number of permutations is endless. Today, InvestCloud has over 2,000 views of information that have been built without one programmer writing one line of code.

  7. Mashing

The power of applets results in the empowerment of the user to have control and the ability to combine many data dimensions and aspects from the Data Warehouse 3.0. Mashing also enables many different third party components to be assembled to provide an even more powerful experience.


. . . and finally, the three “Ts”:

  1. T[…] – something that starts with the letter T.
  2. T[…] – something else that starts with the letter T.
  3. T[…] – finally, something else starting with the letter T.

The above three “Ts” are the “secret sauce” of Data Warehouse 3.0 and have been designed and implemented via InvestCloud. The challenge for most data warehouse developers is that the three “Ts” are not that obvious, nor will we publish them here.  However, they are fundamental to design and usability.

I hope our illustration of points 4 through 7 is useful and provides an insight into Data Warehousing 3.0 and InvestCloud. In InvestCloud SaaS we have implemented Data Warehouse 3.0, including all the “Ts”. Today the time it takes to implement Data Warehouse 3.0 via InvestCloud SaaS is weeks, not months (or years, as was the case with Data Warehouse 2.0).

In conclusion

The final problem we had with Data Warehouse 2.0 was the enterprise deployment model (now resolved in 3.0). Organizations that implement enterprise software struggle with implementing updates and patches to the software. There is a tremendous, typically undocumented, cost of ownership for enterprise software, born from the time IT teams must spend to maintain, update and re-test. InvestCloud, through our SaaS, PaaS and BPaaS cloud models, now implements updates for all customers weekly, rather than yearly. We take on the cost of testing and updating, and our clients save considerable time and money.

Original article at: http://investcloudsolutions.blogspot.com/2012/05/data-warehousing-30-understand.html

John Wise introduces Cloudonomics (part 1 of 4)

Cloudonomics is the economic effect for the consumer and the supplier of selecting a cloud-based operating model (cloud hardware, cloud software and cloud service) as opposed to an enterprise software and services model. While the “cloud” is commonly used inside the home by most families (e.g., email, social media, Gmail, Facebook, Amazon, etc.), it has yet to be widely adopted in the workplace. As the “cloud” moves into the work environment as a better and more cost-effective alternative to the 1980s approach of enterprise software, this situation is changing rapidly, as discussed in this paper, and will have significant impact on suppliers and consumers.
 
In this white paper, the “cloud” will be viewed in three distinct levels: Level 1 incorporates only hardware; Level 2 includes hardware as well as software; and Level 3 encapsulates hardware, software and services. Varying degrees of Cloudonomics are applied to each of the three levels.
 
True cloud software will result in better user interfaces and productivity, the ability to cloudsource, and even significant savings in hardware costs.
 
The true benefits of cloud are significant, but be aware of cloud “fakes” as they will result in new costs with no benefits. Outsourcing hardware is not uncommon; however, this doesn’t equal cloud. There are software vendors that will provide a browser interface, but when achieved through slow desktop emulation, this also doesn’t equal cloud, and there are services that have been outsourced but rarely do they actually equal cloud.
There has to be consideration for enterprise solutions versus public clouds and private clouds before discussing the three levels of cloud. 

Enterprise Solutions
The challenges inherent in legacy enterprise software solutions include a high cost of ownership and support, lack of scalability and recoverability, and poor security. Most enterprise solutions are poorly tested, or often not fully tested, to a state of functioning recovery (i.e. hardware may fail over but software does not). Malicious insiders and access vulnerabilities do exist, making internal users, and passwords left on sticky notes, a serious weakness. Additionally, most enterprise solutions have limited resources, procedures and controls for providing infrastructure, platform and services. Keeping the software up to date in the enterprise environment can be a significant job in itself.
 
Public Cloud Solutions
A public cloud has clear financial and operational benefits; however, these benefits should be carefully considered against security issues. The public cloud vendors (infrastructure, product or services) will continue to have a wide range of issues that will remain in the forefront of the industry mind. As with any business there are data sensitivities that need to be addressed and thought through when looking to share consumer-level clouds.
 
Classic issues include shared hardware issues, unknown vendor sites, unknown staff, data leaks, insecure API (application programming interfaces), location (e.g., outside the United States) and malicious insiders.
 
Private Cloud Solutions
Private cloud vendors undertake due diligence on the service chain to ensure the highest standards are achieved. This results in customers being satisfied that a private cloud offering offers them the security and configuration fitting for their business. Important advantages that a private cloud offers are known providers and vendors; physical checks and monitoring of data centers; known staff with established processes, procedures and controls; defined contractual terms; and fully tested and scalable disaster recovery.
 
Typically private cloud vendors will ensure that there are no shared technology issues (i.e., dedicated hardware that is not shared with other companies). This eliminates the possibility of third-party data contamination (i.e. data loss or leakage) and technology update vulnerability.
Private cloud vendors often have defined security practices that include options ranging from Secure Sockets Layer (SSL), Virtual Private Network (VPN) and encrypted data storage to host intrusion detection systems and federated identity management systems to overcome alleged security issues.
 
Normative Cloudonomics (advocating “what ought to be”) and the current enterprise software and outsourcing model will result in challenges that enterprise suppliers need to overcome. The challenge of creating multi-tenanted and virtualized software is difficult; however, several leading cloud companies, such as Salesforce.com, have demonstrated that these obstacles can be successfully addressed.
 
The combination of hardware cloud, software cloud and cloudsourcing will result in a paradigm change, with the enterprise option looking less and less attractive as the market adopts cloud.
 
Cloudonomics provides significant benefits that include:
  • More cost-effective hardware options
  • Better recovery options
  • Reduced cost of software, integration, maintenance and training
  • Pay-as-you-go pricing models
  • Better user experience
  • Improved time to market
  • Improved access to information (mobile)
  • More extensive access to integrated information
  • Access to wider service offering

Cloudonomics will result in what is commonplace in home computing (i.e. eBay, Amazon, Google, online banking, etc.) becoming commonplace in the corporate/enterprise environment, and will result in a new set of vendors in the marketplace.

 
A 2011 report titled “State of Cloud Survey” found that 88% of businesses felt that the cloud would improve their IT agility. The U.S. Commerce Department’s National Institute of Standards and Technology (NIST) has released papers to foster federal agencies’ adoption of cloud computing.
 
Many enterprises are rapidly migrating to the cloud to achieve significant cost savings. Forrester Research recently forecasted that the market for cloud services will grow to more than $241 billion by 2020.
Read more at http://investcloudsolutions.blogspot.com/2012/04/john-wise-introduces-cloudonomics-part.html?view=flipcard

Accounting and Data Warehousing: Common misconception

There is a common misconception between the need to “update your accounting engine” and the need to “create a holistic investment platform”.

Ten years ago, the concept of creating a holistic platform seemed too futuristic, and therefore upgrading your accounting engine was the de facto choice. Much of the decision centered around selecting i) an object-based or non-SQL system (i.e. a “closed architecture” system by nature, meaning custom reporting is difficult and complex to develop), or ii) a SQL-based system (SQL is an “open architecture”; it is a common skill set, and thus it is easier to customize the system and its reports).

Given the above, most firms chose the “open architecture” SQL-based system, if available, provided functionality and asset class coverage were the same.

However, today the focus driving a platform design/selection should not be on individual components, such as the accounting engine, but should instead be on the larger and more important question of what the holistic investment platform needs to achieve.
The question is: do you go with an Accounting-Centric approach or an Information-Centric approach?

1) Accounting-Centric is a 1990s approach: The issues are i) accounting engines are not designed to store a wider set of information such as trading, risk, compliance, market fundamentals, or investor/client data; and ii) the data structures that are ideal for accounting are the least ideal for reporting. The historical reaction by accounting vendors has been to create what they wrongly call a data warehouse (technically, a data mart) holding the accounting data in a better format. These data marts are very limited in the types of data they store and typically store ONLY accounting information. The technical point here is that a DATA MART is a cheap and poor alternative to a DATA WAREHOUSE.

2) The norm today is an Information-Centric architecture: The benefits are i) all reporting is derived from a single data store (warehouse or repository) that stores all information from the entire trade life cycle (trading, investors, risk, compliance and so on); and ii) the actual data structures within the data warehouse are designed for reporting, and hence provide high degrees of flexibility.

One of the key benefits (driving it to become the norm) of the Information-Centric approach is that the presence of a true data warehouse within the platform renders the underlying reporting capabilities within the accounting platform less significant.  Thus firms can focus on accounting engines that are best-in-class in accounting, and not focus on distracting add-ons, such as a data mart (a half-baked fix).

The final twist is the methods deployed in the data warehouse. Today, data warehousing has evolved to become a data repository that combines the need for an online operational repository with the need for flexible and consolidated enterprise reporting. This remains a core component of an information-centric architecture and platform.

Michael is a founding member and COO of InvestCloud. In his role, Michael is focused on the product development of the world’s first investment management platform in the cloud.

Original article at: http://investcloudsolutions.blogspot.com/2012/03/accounting-and-data-warehousing-common.html?view=flipcard