Back to Table of contents

Primeur weekly 2018-01-22

Focus

European Commission explains why Joint Undertaking is well suited as legal instrument to help create EuroHPC ecosystem ...

Exascale supercomputing

Exascale architectures lead to greener and more advanced combustion systems ...

Call for Proposals: Aurora Early Science Programme expands to include data and learning projects ...

Quantum computing

HKU quantum physicist Dr. Giulio Chiribella receives Croucher Senior Research Fellowship 2018 ...

New input for quantum simulations ...

Focus on Europe

Eni boots up HPC4 and makes its computing system the world's most powerful in the industry ...

ONERA to install new supercomputer for aerospace research ...

Atos to deliver the most powerful supercomputer in Germany at Forschungszentrum Jülich ...

Hardware

Cray announces selected preliminary 2017 financial results ...

India's Ministry of Earth Sciences deploys new Cray XC40 supercomputers and Cray storage systems ...

University of Virginia Engineering tapped to lead $27.5 million centre to reinvent computing ...

Asperitas creates AsperitasEI business unit to bring circular energy and data centre projects to life ...

CSRA selects edge solutions and Supermicro computer for expansion, increasing NASA computing capacity to 5 petaFLOPS ...

Mellanox ConnectX-5 Ethernet adapter wins Linley Group Analyst Choice Award for Best Networking Chip ...

Notre Dame to lead $26 million multi-university research centre developing next-generation computing technologies ...

New $32 million centre at University of Michigan reimagines how computers are designed ...

New C-BRIC centre will tackle brain-inspired computing ...

Ultra-thin memory storage device paves way for more powerful computing ...

Applications

US DOE announces funding for new HPC4Manufacturing industry projects ...

NOAA kicks off 2018 with massive supercomputer upgrade ...

UMass Center for Data Science partners with Chan Zuckerberg Initiative to accelerate science and medicine ...

Himawari-8 data assimilated simulation enables 10-minute updates of rain and flood predictions ...

Ohio Supercomputer Center to host free webinar on innovative web-based HPC portal ...

2D tin (stanene) without buckling: A possible topological insulator ...

Uncovering decades of questionable investments ...

Groundbreaking conference examines how AI transforms our world ...

Framework for Research Data Management makes life simpler for researchers ...

The Cloud

New centre headquartered at Carnegie Mellon will build smarter networks to connect edge devices to the Cloud ...

IBM and Salesforce strengthen strategic partnership ...

ANSYS and Rescale offer on-demand, pay-per-use ANSYS software on Rescale's ScaleX Cloud HPC platform ...

Uncovering decades of questionable investments


The plot shows the time-series of aggregate lottery demand. Aggregate lottery demand in any month t is measured as the equal-weighted (EWMAX) or value-weighted (VWMAX) average value of MAX across all stocks in the sample in month t. Credit: Murray, Bali, Brown and Tang.
17 Jan 2018 Austin - One of the key principles in asset pricing - how we value everything from stocks and bonds to real estate - is that investments with high risk should, on average, have high returns.

"If you take a lot of risk, you should expect to earn more for it", stated Scott Murray, professor of finance at George State University. "To go deeper, the theory says that systematic risk, or risk that is common to all investments" - also known as 'beta' - "is the kind of risk that investors should care about."

This theory was first articulated in the 1960s by Sharpe (1964), Lintner (1965), and Mossin (1966). However, empirical work dating as far back as 1972 didn't support the theory. In fact, many researchers found that stocks with high risk often do not deliver higher returns, even in the long run.

"It's the foundational theory of asset pricing but has little empirical support in the data. So, in a sense, it's the big question", Scott Murray stated.

In a recent paper in the Journal of Financial and Quantitative Analysis, Scott Murray and his co-authors Turan Bali from Georgetown University, Stephen Brown from Monash University, and Yi Tang from Fordham University, argue that the reason for this 'beta anomaly' lies in the fact that stocks with high betas also happen to have lottery-like properties - that is, they offer the possibility of becoming big winners. Investors who are attracted to the lottery characteristics of these stocks push their prices higher than theory would predict, thereby lowering their future returns.
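The lottery measure shown in the figure, MAX, captures how large a stock's best recent days were. As an illustration only - the exact construction is described in the paper; this sketch assumes the common convention in the MAX literature of averaging a stock's five highest daily returns within the month - it can be computed as:

```python
import numpy as np

def max_measure(daily_returns, k=5):
    """Lottery measure: mean of the k highest daily returns in the month.
    k=5 is an assumption borrowed from the related MAX literature, not
    necessarily the exact choice made in the paper."""
    returns = np.sort(np.asarray(daily_returns, dtype=float))
    return returns[-k:].mean()

# A stock with one huge up-day looks lottery-like even if most days are flat
quiet_month = [0.001] * 21                 # 21 trading days, all small moves
spiky_month = [0.001] * 20 + [0.25]        # same, plus one 25% jump
```

Here `max_measure(spiky_month)` is far larger than `max_measure(quiet_month)`, even though the two stocks behave identically on 20 of their 21 trading days - which is exactly the lottery-like payoff profile the authors argue investors overpay for.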

To support this hypothesis, they analyzed stock prices from June 1963 to December 2012. For every month, they calculated the beta of each stock (up to 5,000 stocks per month) by running a regression - a statistical way of estimating the relationships among variables - of the stock's return on the return of the market portfolio. They then sorted the stocks into 10 groups based on their betas and examined the performance of stocks in the different groups.
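That monthly procedure can be sketched in a few lines of Python. This is a simplified illustration with simulated data, not the authors' code; beta is estimated as the OLS slope of a stock's returns on the market's returns:

```python
import numpy as np

def estimate_beta(stock_returns, market_returns):
    """OLS slope of a stock's returns on the market's returns."""
    cov = np.cov(stock_returns, market_returns)
    return cov[0, 1] / cov[1, 1]

def decile_labels(values):
    """Label each stock 0 (lowest) through 9 (highest) by sorted rank."""
    ranks = np.argsort(np.argsort(values))
    return ranks * 10 // len(values)

# Simulated sample: 100 stocks, 60 months of returns each
rng = np.random.default_rng(0)
market = rng.normal(0.01, 0.04, 60)
true_betas = rng.uniform(0.2, 2.0, 100)
stocks = true_betas[:, None] * market + rng.normal(0.0, 0.05, (100, 60))

betas = np.array([estimate_beta(s, market) for s in stocks])
deciles = decile_labels(betas)             # 10 groups of 10 stocks each
```

One would then track the subsequent returns of each decile: the theory predicts that the highest-beta decile should outperform the lowest over the long run.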

"Theory predicts that stocks with high betas do better in the long run than stocks with low betas", Scott Murray stated. "Doing our analysis, we find that there really isn't a difference in the performance of stocks with different betas."

They next analyzed the data again, this time also calculating, for each stock-month, how lottery-like the stock was. Once again, they sorted the stocks into 10 groups based on their betas and repeated the analysis. This time, however, they imposed a constraint requiring each of the 10 groups to hold stocks with similar lottery characteristics. By making sure the stocks in each group had the same lottery properties, they controlled for the possibility that their original tests failed to detect a performance difference simply because stocks in different beta groups have different lottery characteristics.
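One way to sketch such a conditional sort - a hypothetical simplification, not the paper's exact procedure - is to bin stocks by the lottery measure first and then rank by beta within each bin, so that every final beta group draws stocks from every lottery bin:

```python
import numpy as np

def conditional_beta_groups(betas, lottery, n_groups=10):
    """Beta groups that hold lottery characteristics roughly constant.

    Stocks are first binned by their lottery measure; beta ranks are then
    formed *within* each bin, and stocks sharing a within-bin beta rank
    are pooled. Every final group therefore mixes all lottery levels.
    """
    n = len(betas)
    lottery_bins = np.argsort(np.argsort(lottery)) * n_groups // n
    groups = np.empty(n, dtype=int)
    for b in range(n_groups):
        idx = np.flatnonzero(lottery_bins == b)
        within_rank = np.argsort(np.argsort(betas[idx]))
        groups[idx] = within_rank * n_groups // len(idx)
    return groups
```

Comparing average returns across these groups then isolates the beta effect from the lottery effect, because high- and low-beta groups no longer differ systematically in their lottery characteristics.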

"We found that after controlling for lottery characteristics, the seminal theory is empirically supported", Scott Murray stated.

In other words: price pressure from investors who want lottery-like stocks is what causes the theory to fail. When this factor is removed, asset pricing works according to theory.

Other economists had pointed to a different factor - leverage constraints - as the main cause of this market anomaly. On that view, large investors such as mutual funds and pension funds, which are not allowed to borrow money to buy large amounts of lower-risk stocks, are forced into higher-risk stocks to generate the returns they need, thus distorting the market.

However, an additional analysis of the data by Scott Murray and his collaborators found that lottery-like stocks were most often held by individual investors. If leverage constraints were the cause of the beta anomaly, mutual funds and pension funds would instead be the main holders driving up demand for these stocks.

The team's research won the prestigious Jack Treynor Prize, given each year by the Q Group, which recognizes superior academic working papers with potential applications in the fields of investment management and financial markets.

The work is in line with ideas like prospect theory, first articulated by Nobel-winning behavioural economist Daniel Kahneman, which contends that investors typically overestimate the probability of extreme events - both losses and gains.

"The study helps investors understand how they can avoid the pitfalls if they want to generate returns by taking more risks", Scott Murray stated.

To run the systematic analyses of the large financial datasets, Scott Murray used the Wrangler supercomputer at the Texas Advanced Computing Center (TACC). Supported by a grant from the National Science Foundation, Wrangler was built to enable data-driven research nationwide. Using Wrangler significantly reduced the time-to-solution for Scott Murray.

"If there are 500 months in the sample, I can send one month to one core, another month to another core, and instead of computing 500 months separately, I can do them in parallel and have reduced the human time by many orders of magnitude", he stated.

The dataset for the lottery-effect research was not enormous and could have been processed on a desktop computer or a small cluster, albeit more slowly. However, other problems that Scott Murray is working on - for instance, research on options - have much higher computational requirements and call for supercomputers like those at TACC.

"We're living in the Big Data world", he stated. "People are trying to grapple with this in financial economics as they are in every other field and we're just scratching the surface. This is something that's going to grow more and more as the data becomes more refined and technologies such as text processing become more prevalent."

Though historically used for problems in physics, chemistry and engineering, advanced computing is starting to be widely used - and to have a big impact - in economics and the social sciences.

According to Chris Jordan, manager of the Data Management & Collections group at TACC, Scott Murray's research is a great example of the kinds of challenges Wrangler was designed to address.

"It relies on database technology that isn't typically available in high-performance computing environments, and it requires extremely high-performance I/O capabilities. It is able to take advantage of both our specialized software environment and the half-petabyte flash storage tier to generate results that would be difficult or impossible on other systems", Chris Jordan stated. "Dr. Murray's work also relies on a corpus of data which acts as a long-term resource in and of itself - a notion we have been trying to promote with Wrangler."

Beyond its importance to investors and financial theorists, the research has a broad societal impact, Scott Murray contended.

"For our society to be as prosperous as possible, we need to allocate our resources efficiently. How much oil do we use? How many houses do we build? A large part of that is understanding how and why money gets invested in certain things", he explained. "The objective of this line of research is to understand the trade-offs that investors consider when making these sorts of decisions."

Source: University of Texas at Austin, Texas Advanced Computing Center - TACC
