Market data trends for 2012
Oliver Muhr, executive vice president of SunGard’s MarketMap business unit, said: “Economists, equity, fixed income researchers and quant traders need historical data to better understand growth opportunities and validate market positions and trading strategy. This requires not only more data, but more minute and granular information provided in a fast and efficient manner. SunGard offers information management tools that help enterprises filter and curate data for price discovery, financial modeling, risk management and business intelligence.”
The ten market data trends SunGard has identified for 2012 in historical data management are:
Transparency:
1. Firms need more consistent and timely reporting to meet new regulations and investor demands, creating greater strain on data infrastructures that feed risk reporting
2. Risk reports will be required by regulators and investors almost daily, while on-demand data will be needed to meet more advanced analytics
3. Greater transparency in analyzing the relationships between asset classes, such as complex derivatives, is driving the need for standardized entity and security identifiers, and cross symbology
Efficiency:
4. Larger data sets are required to feed predictive models, as more historical data over longer time periods and increased granularity of data sets power back-tests, forecasts and trading impacts throughout the day
5. Firms are focused on controlling variable data costs by centralizing historical data in one location to assess best price
6. Practitioners such as MBAs and CFAs want more flexible data management solutions that require less IT support so that they can spend more time discovering market opportunities
7. With globalization of markets, historical data brings greater complexity in terms of cross-border currencies, valuations and accounting standards – requiring improved accuracy and more market data coverage across assets and regions
Networks:
8. In order to perform the advanced analytics and calculations required to support electronic trading strategies, firms must implement platforms that can store greater quantities of data and quickly retrieve and accurately process historical and time series data
9. Vector storage, rather than traditional relational databases, will be needed to understand complex trends and scenarios
10. Cleaning and storing historical data is driving firms to seek plug-and-play technology that fits with industry standard infrastructures
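The vector-storage point above can be made concrete with a minimal sketch. The contrast is between a row-oriented layout (one record per tick, as a relational table stores it) and a columnar, or vector, layout where an entire price history sits in one contiguous array that analytics can process in a single pass. All names and figures here are illustrative assumptions, not from the article.

```python
# Illustrative sketch: row-oriented records vs. a columnar (vector) layout
# for time-series analytics. Data values are made up for demonstration.
import numpy as np

# Row-oriented: one dict per tick, as a relational table would store it.
rows = [{"ts": t, "price": 100.0 + 0.1 * t} for t in range(1000)]

# Vector (columnar) layout: the whole price history as one contiguous array,
# which vectorized analytics can scan without per-row overhead.
prices = np.array([r["price"] for r in rows])

def moving_average(vec: np.ndarray, window: int) -> np.ndarray:
    """Trailing moving average computed in one vectorized pass."""
    kernel = np.ones(window) / window
    return np.convolve(vec, kernel, mode="valid")

# Each output element averages `window` consecutive prices.
ma = moving_average(prices, window=20)
```

Time-series databases used on trading desks take this idea further, but the core advantage is the same: column-at-a-time access keeps historical scans fast.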
Paul Rowady, senior analyst at TABB Group, said: “Data management has been, and always will be, among the most critical components of the quantitative process. It is well known in the quant world that the depth of historical archive – the timeframe of data used for backtesting – is inversely proportional to the turnover of the strategy in question. Therefore, today’s trend toward slower-turnover strategies means that a proportional increase in the scale of the data will be required, as well as the most granular data possible in order to provide maximum flexibility for strategy development today and down the road. In fact, dealing with data at the granular level and in a hands-on environment is paradoxically the most valuable exercise a quant can do to understand subtle market inefficiencies.”
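Rowady's rule of thumb — backtest depth inversely proportional to strategy turnover — can be sketched in a few lines. The proportionality constant below is a made-up assumption for illustration only; the article gives no figures.

```python
# Illustrative only: required history depth scales inversely with turnover
# (depth = k / turnover). The constant k = 10.0 is an assumed value,
# not taken from the article.
def required_history_years(annual_turnover: float, k: float = 10.0) -> float:
    """Rule-of-thumb backtest depth in years for a given annual turnover."""
    if annual_turnover <= 0:
        raise ValueError("turnover must be positive")
    return k / annual_turnover

# A fast strategy turning over 50x a year needs far less history
# than a slow one turning over 2x a year.
fast = required_history_years(50.0)
slow = required_history_years(2.0)
```

This is why the shift toward slower-turnover strategies that Rowady describes implies proportionally deeper historical archives.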

The ESG journey: public honesty for a profitable future
Citywealth asked Amy Blackwell to give an overview of the current state of ESG investments and programmes, including the impact of climate change and greenwashing, with a focus on the relationship between investors and advisers.