Meet the New Software Analyst

As US equity markets closed out 2013 at new highs, the future of equity research faces significant change. With “price targets” being reset for many soaring social, cloud and big data analytics stocks, let’s meet the new software analyst. But first, a little background.

Equity research has evolved only marginally with investment styles and trading strategies over the past couple of decades. The days of primary fundamental research, particularly on the sell-side, faded long ago. Most analysts have neither the gumption nor the time.

Shrinking commissions and heightened regulatory scrutiny yield lower returns on investment, continuing a cycle of reduced research resources. The sell-side analyst role now has three principal components: 1) providing access to company managements in the existing coverage universe; 2) providing coverage for companies that are underwriting clients; and 3) providing “hot data points” – particularly for handicapping quarterly results. Buy-siders compete for management access and seek to combine these data points with their own findings to feed trading decisions.

Unfortunately, individual data points legally obtained and disseminated rarely provide an adequate sample size on which to base an investment decision, let alone a trading decision. For buy-siders, even aggregating data points from numerous analysts covering a particular sector or company does not produce a statistically relevant sample.

Limitations of today’s analytics

For example, let’s say a mid-sized publicly-traded technology company goes to market with a blend of 100 direct sales teams (one salesperson and one systems engineer per team) and 500 channel partners (mixed 75%/25% between resellers and systems integrators). Further, assume that these teams and partners are dispersed in proportion to the company’s 65%/35% sales mix between North America and international. How many salespeople and channel partners would an analyst have to survey to get an accurate picture of the company’s business in any given quarter?
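
To make that question concrete, a standard sample-size calculation for estimating a proportion, with a finite population correction, gives a rough lower bound. The sketch below is illustrative only; the 95% confidence level, ±5% margin of error and maximum-variance assumption (p = 0.5) are conventions I have assumed, not figures from the example.

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Required sample size for a proportion, with a finite population correction.

    Illustrative assumptions: 95% confidence (z = 1.96), +/-5% margin of error,
    and maximum variance (p = 0.5).
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # correct for the small population

# The hypothetical company from the example above.
print(sample_size(100))   # direct sales teams  -> about 80 of 100
print(sample_size(500))   # channel partners    -> about 218 of 500
```

Even under these generous assumptions, the analyst would need to reach roughly 80 of the 100 direct teams and more than 200 of the 500 partners every quarter for a single company – which is exactly the point of the question.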

If a typical sell-side analyst covers 15-20 companies (quintuple that for buy-side analysts), the multiplier effect of data points that an analyst would have to touch makes it humanly impossible to gather sufficient information. Moreover, with 50% of most tech company deals closing in the final month of a quarter, of which half often close in the final two weeks of that month, how much visibility can an analyst have?

Further, why would a company’s sales team talk to anyone from the investment community in the final weeks of a quarter when the only people they are interested in speaking with are customers who can sign a deal? Now consider that many companies throughout the supply chain have instituted strict policies in response to recent scandals to prevent any employee from having any contact with anyone from the investment community.

Even the best-resourced analysts lack the tools to correlate the data points they do gather into meaningful patterns for either an individual company or an entire sector. Finally, with shorter-term investing horizons and high-frequency trading dominating volume, how relevant are these data points anyway?

The big data approach to research

Stocks generally tend to trade on either sector momentum or overall market momentum. Macro news or events are far more likely to drive a sector’s movement, and therefore the movement of any stock in that sector. This includes volatility around quarterly earnings – which can run 10%-30% for technology stocks – because “beats” and “misses” are more often driven by macro factors than by company-specific ones. Excuses such as “sales execution,” “product transition” or “merger integration” issues are less common than conference calls would suggest. “Customers postponed purchases,” “customers down-sized deals,” “customers released budgets” or “a few large deals closed unexpectedly” are more likely explanations.

Now, major sell-side and buy-side institutions are trialing new software that leverages cloud infrastructure and big data analytics to model markets and stocks. Massive data sets can include macro news from anywhere in the world, such as economic variables, political events, and seasonal and cyclical factors. These can be blended with company-specific events, including earnings, financings or M&A activity. Newer data sources, including social media, GPS and spatial data, can also be layered into models. Users can input thousands of variables to build specific models for an entire market or an individual security.
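
As a rough sketch of what such a model might look like under the hood, the Python example below blends hypothetical macro, sector, company-event and social-sentiment features and fits a gradient-boosted model with walk-forward validation. Every column name and the placeholder data are assumptions for illustration, not a description of any particular vendor’s system.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

# Hypothetical feature matrix: macro series blended with company-specific signals.
rng = np.random.default_rng(0)
n = 500  # trading days
features = pd.DataFrame({
    "pmi_surprise":     rng.normal(0, 1, n),    # macro: manufacturing survey surprise
    "rates_move_bps":   rng.normal(0, 5, n),    # macro: daily change in the 10-year yield
    "sector_momentum":  rng.normal(0, 1, n),    # trailing one-month sector index return
    "earnings_window":  rng.integers(0, 2, n),  # 1 if the day falls in an earnings window
    "social_sentiment": rng.normal(0, 1, n),    # scored mentions from social feeds
})
next_day_return = rng.normal(0, 0.02, n)        # placeholder target; real work uses actual returns

# Walk-forward validation respects time ordering, unlike a random train/test split.
model = GradientBoostingRegressor()
scores = cross_val_score(model, features, next_day_return,
                         cv=TimeSeriesSplit(n_splits=5), scoring="r2")
print(scores.mean())
```

The time-ordered split matters: shuffling days across the train/test boundary would leak future information into the model and flatter its apparent accuracy.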

As with any predictive analytics model, the key is to ask the right questions. However, the machine learning capabilities of the software allow the system not only to answer queries but also to determine what questions to ask.

The advantages to both sell-side and buy-side firms are significant. They include:

  • Lower costs. Firms can avoid major technology investments by leveraging the scale and processing power of cloud-based infrastructure and analytics software. They can collect, correlate and analyze huge, complex data sets and build models in a fraction of the time, and at a fraction of the cost, that it would take in-house analysts on their own.
  • Accuracy. Machine learning and advanced predictive analytics techniques are far more reliable and scalable than models built in Excel spreadsheets. They can detect patterns and capture small nuances within markets and between securities that high-frequency trading platforms have been exploiting for years.
  • Competitiveness. The software can make both sell-side and buy-side firms more competitive with the largest, most technologically advanced hedge funds that have custom-built platforms to perform analytics on this scale in real time. In addition to enhancing performance, the software can be leveraged to improve client services by making select tools available to individual investors.

Analysts become data scientists

The analyst’s skill set must evolve. Analysts will still have to perform fundamental analysis to understand the markets they follow and each company’s management, strategy, products/services and distribution channels. And they will still have to judge whether a company can execute on these factors.

But to increase their value, analysts will have to do statistical modeling and use analytics tools to gain a deeper understanding of which drivers move markets, sectors or particular stocks. Data discovery and visualization tools will replace spreadsheets for identifying dependencies, patterns and trends, for valuation analysis, and for investment decision making. Analysts will also need a deeper understanding of client strategies and trading styles in order to tailor their “research” to individual clients.
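
A minimal sketch of the kind of dependency analysis this implies, assuming a hypothetical CSV of daily closing prices and illustrative column names (“SECTOR_INDEX”, “STOCK_A”), might look like this:

```python
import pandas as pd

# Hypothetical daily closing prices; columns are tickers, the index is the date.
prices = pd.read_csv("closes.csv", index_col="date", parse_dates=True)
returns = prices.pct_change().dropna()

# Which names move most closely with the sector index?
corr_to_sector = (returns.drop(columns=["SECTOR_INDEX"])
                         .corrwith(returns["SECTOR_INDEX"])
                         .sort_values(ascending=False))
print(corr_to_sector.head(10))

# A 60-day rolling correlation exposes regime changes a static spreadsheet hides.
rolling = returns["STOCK_A"].rolling(60).corr(returns["SECTOR_INDEX"])
rolling.plot(title="STOCK_A vs. sector: 60-day rolling correlation")
```

The rolling window surfaces when a stock decouples from its sector – the sort of shift that is easy to miss in a spreadsheet of quarter-end snapshots.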

These technologies may well continue to shrink the ranks of analysts because of their inherent advantages. But those analysts who can master these techniques to complement their traditional roles may not only survive, but lift their value – at least until the playing field levels – because of their new alpha-generating capabilities.

WebComp Analyst Review and Awesome Bonus

WebComp Analyst is a new keyword and link analysis tool created by Jonathan Leger. But what does it do? Can it really help your sites rank well on the search engines? Or is it just another keyword analysis tool that you’ll use once or twice and then never touch again? Let’s find out in this review.

If you have been doing online marketing for a while, I’m sure you know that web traffic is one of the most important factors for success. Every successful website owner knows that without traffic a website is dead! The main reason more than 95% of web marketers fail is that they don’t get traffic.

It’s obvious that your websites need traffic in order to succeed, and the best source of consistent, high-quality traffic is the search engines. Unfortunately, most websites get little or no traffic from them. That’s where WebComp Analyst comes to the rescue. It’s more than a keyword suggestion and analysis tool; it also tells you whether or not you should target certain keywords or keyword phrases. In other words, it guides you to target the RIGHT keywords (those worth targeting).

Remember this… targeting certain keywords can give you traffic, but NOT all keywords are worth targeting. Some keywords can bring traffic yet have no commercial value, meaning you cannot convert that traffic into sales. Therefore, targeting keywords that convert well is KEY to your success.

If you know anything about keyword ranking in Google, then you know that ranking is largely about links. For example, many websites can target the same keyword, but those with more links containing the targeted keyword in the anchor text tend to come out as winners. You can gather this information manually for free, but it is very time-consuming and tedious; checking links and anchor text by hand can take hours. With WebComp Analyst, all of this information can be obtained in seconds, saving you a huge amount of time.
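
For a sense of what the tool is automating, here is a minimal Python sketch that counts the anchor text of links on a single page. The URL is a placeholder, and a real competitive analysis would repeat this across every page that ranks for the keyword.

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

def anchor_text_counts(url):
    """Count the anchor text of every link on a page.

    A rough sketch of the manual check described above; a tool like
    WebComp Analyst aggregates this kind of data across many competing pages.
    """
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    anchors = (a.get_text(strip=True).lower() for a in soup.find_all("a"))
    return Counter(text for text in anchors if text)

# Example: how often does a competitor's page use the target keyword as anchor text?
counts = anchor_text_counts("https://example.com")  # placeholder URL
print(counts.most_common(10))
```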

For those involved in researching and building niche sites, WebComp Analyst is worth checking out. It can certainly save you hours of work. Using this tool can also increase your chance of success: your niche sites are more likely to get web traffic if you analyze all of your keywords with it before building the actual sites. This will save you not only time but also money in the long run.

Another good point I want to highlight in this review is that WebComp Analyst (WCA) costs only $67 one-time. There are no monthly fees or other additional charges.

Is there any bad point?

Honestly speaking, it’s very difficult to find a flaw in this product. However, you should be aware that WCA does not run natively on the Macintosh. If you’re using a Mac, you need to install the Mono Framework for Mac OS. That’s the only flaw I can find.

I hope this review is useful for you.

Maximize Your Market Research With Google Trends

It doesn’t take long in the world of Internet marketing before you recognize the importance of good market research when it comes to the health and ultimate success of your business. While you can spend big money hiring people to do market research for you, or spend significant time and energy attempting to conduct it on your own, few methods are more user-friendly and effective than Google Trends.

Google is known worldwide for its useful tools, from its highly popular search engine to Gmail to a number of research and analytics tools. Google Trends is part of this family of free tools, and it tends to receive rave reviews from all users, regardless of their level of experience or type of business.

Users of Google Trends claim it greatly improves their methods for conducting research online, which in turn helps them to be more productive and efficient in their businesses. They actually end up saving time and money as they learn about search engine trends on a global level.

How Does Google Trends Work?

Google released the tool to the public in 2006. Its purpose was to give users a way to view and monitor how often terms are searched for, and to understand how that search interest shifts and changes over time. Another benefit of Google Trends is that it provides supporting information and relevant news stories related to the particular search term you enter. Instead of having to find that information on your own, it is all presented neatly in an easy-to-understand format.

For example, after you plug a search term into the tool, it provides a line graph of search volume over time, as well as the countries, cities and regions that have searched for your specific term the most. From this you can easily identify the types of products and solutions your target market is seeking. You can also learn exactly what search terms they are using, so you understand how best to apply your SEO tactics.

If you are interested in comparing different terms, you can plot up to five terms on the same graph by entering them separated by commas. Being able to view the results for each term side by side is a great way to determine which words will be most powerful in optimizing your websites, articles, blogs and other online content.
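
For readers comfortable with a little Python, the same comparison can be scripted with the unofficial pytrends package (not a Google product, and its interface can change between versions); the keywords below are purely illustrative.

```python
from pytrends.request import TrendReq  # unofficial Google Trends client

# Compare up to five terms, just as on the Trends site itself.
keywords = ["market research", "keyword research", "seo tools"]  # illustrative terms

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(keywords, timeframe="today 12-m")

interest = pytrends.interest_over_time()     # weekly search-interest index per term
print(interest[keywords].mean().sort_values(ascending=False))

by_region = pytrends.interest_by_region()    # which regions search each term the most
print(by_region.sort_values("market research", ascending=False).head(10))
```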

Far too many online businesses fail because they leave out the vital step of market and product research. Many who begin to use Google Trends discover that they did not know their target market as well as they thought. By using the tool to determine the best products and SEO terms, you are better able to meet your potential customers’ needs, rather than just throwing products out there hoping people will want them.

Once you witness the power of solid market research and product trending for yourself, you will likely never begin another product launch or website without incorporating them into your plan. The good news is that conducting this research doesn’t have to be difficult or even time consuming. Google Trends provides excellent search engine research with a few clicks of your mouse, helping you to propel your online business to the next level.

Market Research Axioms – If You Remember Anything Remember This

The value of strong questionnaire design, complemented by high-quality sample development, is not fully appreciated. Too often these two essential building blocks of market research are relegated to the back of the line on research projects.

Research Axiom One: You can never fully recover from a poorly written questionnaire.

o No manipulation of the variables, regardless of how cleverly done
o No amount of analysis, regardless of how brilliant
o No degree of insightful interpretation, regardless of intellectual prowess
Nothing can save you from a poor research foundation. The building will collapse like a house of cards!

If there is one part of the research process that I know, it is questionnaire design. It is a task repeatedly given insufficient time and attention. Clients and research professionals alike often underestimate the time it takes to develop a truly well-structured and concise instrument.

What amazes me most? Project leaders treat the task as an afterthought, with an attitude of: “Once the questionnaire is done we can get on with the important stuff, like analysis and reporting.” The assumption that analysis is the essence of the research, and the expectation that interpreting the results is where research mastery ultimately lies, are a mystery to me.

Have we not pounded the concept of garbage in, garbage out into our heads? Can new internet tools substitute for critical thinking and the hard work of aligning the research instrument with the purpose of the study, so that it answers the business questions the sponsors paid to have answered?

If this seems like a bit of a rant, well, I guess I am guilty. My own research-on-research, including the use of a 25-point questionnaire audit system, has shown me that even seasoned researchers are less diligent about quality than one would hope. Research is not only science, it is a craft [perhaps an art], and if the proper fundamentals are not applied the product is less than artful.

I will end this part of my ranting with an analogy [but don’t be surprised to hear more on this topic]. If you have not studied and then practiced writing poetry, would you expect to publish a book of poems simply because your marketing department asked you to? Designing a good quality research instrument probably takes less talent than being a good poet, but it’s close.

Wait, not so fast; we are not done. There is another mistake from which you cannot fully recover. A poor questionnaire design is one possible fatal mistake, but it is not the only one: good, solid sample development is also necessary. Here is another Research Axiom worth your consideration.

Research Axiom Two: You can never fully recover from a sample that lacks validity; and once again:

o No manipulation of the variables, regardless of how cleverly done
o No amount of analysis, regardless of how brilliant
o No degree of insightful interpretation, regardless of intellectual prowess
Nothing can save you from a poorly developed sample!

The value of sample development is also underappreciated, as are the skills related to creating a valid sample. Project managers, research analysts, and all those who lose sleep over the quality of the sample sources they have available and who work hard to provide the best possible sample for each research project they conduct, are worth their weight in gold.

With numerous challenges to good sample development always hovering over us, if the research team does not pay close attention to this critically important task, the chances of deriving useful results diminish rapidly. One of the worst situations to be in is standing in front of a room full of executives, presenting the research implications, when from off in the far corner an executive vice president (EVP) asks, “Are you sure about that finding? Who were these respondents? They don’t appear to have any knowledge about the market or our products.”

If you can definitively reply, “We believe the respondents in this sample are qualified,” and then give a crisp response about the quality control (QC) steps used to verify the validity of the sample, you have saved the day. If, on the other hand, you hesitate and cannot defend the validity of the sample, you have lost your audience – there is nothing more they want to hear from you, because in their minds the voice of the respondents does not reflect the people they are trying to reach – and the day ends badly.

If you do not care about the quality of the research you conduct, well, shame on you; but at least recognize that a sample of good quality is a necessity for self-preservation – enough said.