# Research articles for 2020-05-31

arXiv

The standard design of soccer penalty shootouts has received serious criticism due to its bias towards the team that kicks the first penalty. The rule-making body of the sport decided in 2017 to trial an alternative mechanism. Although the adoption of the new policy has stalled, academic researchers have recently suggested other designs to improve fairness. This paper offers an extensive overview of seven such soccer penalty shootout mechanisms, one of them first defined here. Their fairness is analysed in three different mathematical models of psychological pressure. We also consider the probability of reaching the sudden death stage, as well as the complexity and strategy-proofness of the designs. Some rules are found to be inferior as they do not lead to a substantial gain in fairness compared to simpler mechanisms. Our work has the potential to impact decision-makers, who can save resources by choosing only theoretically competitive designs for field experiments.
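
The first-mover bias can be illustrated with a small Monte Carlo sketch. The pressure model below (a kicker scores with a lower probability when their team trails, and sudden death is ignored) is an assumption for illustration, not one of the paper's three models; the probabilities `p_base` and `p_behind` are likewise hypothetical.

```python
import random

def shootout(order, p_base=0.75, p_behind=0.65, rng=None):
    """Simulate one best-of-five shootout between teams 0 and 1.

    `order` lists which team takes each of the ten kicks. A kicker scores
    with probability p_base, reduced to p_behind when their team trails
    at the moment of the kick (a simple "lagging behind" pressure model).
    Returns the winner (0 or 1), or None for a tie after ten kicks.
    """
    score = [0, 0]
    for team in order:
        p = p_behind if score[team] < score[1 - team] else p_base
        if rng.random() < p:
            score[team] += 1
    if score[0] == score[1]:
        return None
    return 0 if score[0] > score[1] else 1

def first_kicker_advantage(order, trials=100_000, seed=7):
    """P(team 0 wins) - P(team 1 wins) among decided shootouts."""
    rng = random.Random(seed)
    wins = [0, 0]
    for _ in range(trials):
        w = shootout(order, rng=rng)
        if w is not None:
            wins[w] += 1
    return (wins[0] - wins[1]) / (wins[0] + wins[1])

ABAB = [0, 1] * 5                 # standard order: A first in every pair
ABBA = [0, 1, 1, 0] * 2 + [0, 1]  # alternating order trialled since 2017

gap_abab = first_kicker_advantage(ABAB)
gap_abba = first_kicker_advantage(ABBA)
```

Under this toy model the second kicker faces the behind-pressure penalty more often in ABAB, so alternating the order shrinks the first-mover gap, which is the qualitative effect the compared mechanisms aim at.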

arXiv

We argue for the application of bibliometric indices to quantify the intensity of competition in sports. The Euclidean index is proposed to reward quality over quantity, while the rectangle index can be an appropriate measure of core performance. Their differences are highlighted through an axiomatic analysis and several examples. Our approach also requires a weighting scheme that allows us to compare different positions. The methodology is illustrated by studying the knockout stage of the UEFA Champions League in the 16 seasons played between 2003 and 2019: club and country performances as well as three types of competitive balance are considered. All results are remarkably robust with respect to the bibliometric index and the assigned weights. Inequality has not increased among the elite clubs and between the countries; however, it has changed within some national associations.
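
The two indices have compact standard definitions, sketched below for a generic ranked performance vector (the function names are my own; the weighting scheme the paper applies on top of these is not reproduced here).

```python
import math

def euclidean_index(scores):
    """Euclidean index: the Euclidean norm of the performance vector.
    A few large values outweigh many small ones, rewarding quality."""
    return math.sqrt(sum(s * s for s in scores))

def rectangle_index(scores):
    """Rectangle index: the area of the largest rectangle fitting under
    the ranked (non-increasing) performance curve, i.e. max over i of
    i * (i-th largest score); a core-performance measure akin to h-type
    indices."""
    ranked = sorted(scores, reverse=True)
    return max((i * s for i, s in enumerate(ranked, start=1)), default=0)
```

For example, `euclidean_index([3, 4])` is 5.0, while `rectangle_index([1, 5, 3, 2, 4])` is 9, attained by the top three scores all being at least 3.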

arXiv

We extend the model of Parenti (2018) on large and small firms by introducing cost heterogeneity among small firms. We propose a novel necessary and sufficient condition for the existence of such a mixed market structure. Furthermore, in contrast to Parenti (2018), we show that in the presence of cost heterogeneity among small firms, trade liberalization may raise or reduce the mass of small firms in operation.

arXiv

Cryptocurrencies are a digital medium of exchange with decentralized control that renders the community operating the cryptocurrency its sovereign. Leading cryptocurrencies use proof-of-work or proof-of-stake to reach consensus, and are thus inherently plutocratic. This plutocracy is reflected not only in control over execution, but also in the distribution of new wealth, giving rise to "rich get richer" phenomena. Here, we explore the possibility of an alternative digital currency that is egalitarian in control and just in the distribution of created wealth. Such currencies can form and grow in a grassroots and sybil-resilient way. A single currency community can achieve distributive justice by egalitarian coin minting, where each member mints one coin at every time step. Egalitarian minting results, in the limit, in the dilution of any inherited assets and in each member having an equal share of the minted currency, adjusted by the relative productivity of the members. Our main theorem shows that a currency network, where agents can be members of more than one currency community, can achieve distributive justice globally across the network by joint egalitarian minting, where each agent mints one coin in only one community at each time step. Equality and distributive justice can be achieved among people who own the computational agents of a currency community provided that the agents are genuine (unique and singular). We show that currency networks are sybil-resilient, in the sense that sybils affect only the communities that harbour them. Furthermore, if a currency network has a subnet of genuine currency communities, then distributive justice can be achieved among all the owners of the subnet.
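
The dilution claim for a single community can be checked with a minimal sketch (ignoring the productivity adjustment, multi-community networks, and sybils): when every member mints one coin per step, any inherited holding becomes a vanishing fraction of total wealth.

```python
def egalitarian_minting(initial_wealth, steps):
    """Each member mints one coin per time step. Inherited (initial)
    wealth is diluted, so wealth shares converge toward equality.
    `initial_wealth` maps member -> coins held before minting starts.
    Returns each member's share of the total after `steps` steps."""
    wealth = dict(initial_wealth)
    for _ in range(steps):
        for member in wealth:
            wealth[member] += 1
    total = sum(wealth.values())
    return {m: w / total for m, w in wealth.items()}

# A member inheriting 100 coins vs. members starting from zero:
shares = egalitarian_minting({"rich": 100, "a": 0, "b": 0}, steps=10_000)
```

After 10,000 steps the inherited 100 coins move the "rich" member's share less than one percentage point away from the equal share of 1/3, illustrating the limit statement.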

arXiv

The tick size, which is the smallest increment between two consecutive prices for a given asset, is a key parameter of market microstructure. In particular, the behavior of high frequency market makers is highly related to its value. We take the point of view of an exchange and investigate the relevance of having different tick sizes on the bid and ask sides of the order book. Using an approach based on the model with uncertainty zones, we show that suitably chosen side-specific tick sizes enable the exchange to improve the quality of liquidity provision.

arXiv

The shortcomings of the popular Black-Scholes-Merton (BSM) model have led to models that more accurately capture the behavior of the underlying assets in energy markets, particularly electricity and oil futures prices. In this paper we consider a class of regime-switching time-changed Lévy processes, which builds upon the BSM model by incorporating jumps through a random clock, as well as parameters that vary randomly according to a two-state continuous-time Markov chain. We implement pricing methods based on expansions of the characteristic function as in \cite{Fourier}. Finally, we estimate the parameters of the model from historical energy data and option quotes using a variety of methods.
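
Characteristic-function pricing is model-agnostic: once the characteristic function of the log-price is known, the same inversion prices the option. The sketch below uses Gil-Pelaez inversion (an assumption; the paper's expansion in \cite{Fourier} may differ) with the plain BSM characteristic function as a stand-in, which also allows a check against the closed form; a regime-switching time-changed Lévy characteristic function would simply replace `bsm_cf`.

```python
import cmath
import math

def bsm_cf(u, S0, r, sigma, T):
    """Characteristic function of ln(S_T) under Black-Scholes-Merton."""
    mu = math.log(S0) + (r - 0.5 * sigma ** 2) * T
    return cmath.exp(1j * u * mu - 0.5 * sigma ** 2 * u ** 2 * T)

def call_price_fourier(S0, K, r, sigma, T, u_max=200.0, n=20_000):
    """European call via Gil-Pelaez inversion of the characteristic
    function, using a midpoint rule on [0, u_max]."""
    k = math.log(K)
    du = u_max / n
    phi_minus_i = bsm_cf(-1j, S0, r, sigma, T)  # = S0 * exp(r*T)
    p1 = p2 = 0.0
    for i in range(n):
        u = (i + 0.5) * du
        phi = bsm_cf(u, S0, r, sigma, T)
        phi_shift = bsm_cf(u - 1j, S0, r, sigma, T) / phi_minus_i
        p2 += (cmath.exp(-1j * u * k) * phi / (1j * u)).real * du
        p1 += (cmath.exp(-1j * u * k) * phi_shift / (1j * u)).real * du
    p1 = 0.5 + p1 / math.pi
    p2 = 0.5 + p2 / math.pi
    return S0 * p1 - K * math.exp(-r * T) * p2

def call_price_closed_form(S0, K, r, sigma, T):
    """Black-Scholes closed form, for comparison."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

price_fourier = call_price_fourier(100, 100, 0.05, 0.2, 1.0)
price_closed = call_price_closed_form(100, 100, 0.05, 0.2, 1.0)
```

The two prices agree to well within a cent, confirming the inversion before swapping in a richer characteristic function.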

arXiv

The objective of the paper is to price weather contracts using temperature as the underlying process, when the latter follows mean-reverting dynamics driven by a time-changed Brownian motion coupled to a Gamma Lévy subordinator and a time-dependent deterministic volatility. This type of model captures the complexity of the temperature dynamics, providing a more accurate valuation of the associated weather contracts. An approximate price is obtained by a Fourier expansion of the characteristic function, combined with a selection of the equivalent martingale measure following the Esscher transform proposed in Gerber and Shiu (1994).

arXiv

In this paper we present the impact of alternative data originating from an app-based marketplace, in contrast to traditional bureau data, on credit scoring models. These alternative data sources have shown themselves to be immensely powerful in predicting borrower behavior in segments traditionally underserved by banks and financial institutions. Our results, validated across two countries, show that these new sources of data are particularly useful for predicting financial behavior in low-wealth and young individuals, who are also the most likely to engage with alternative lenders. Furthermore, using the TreeSHAP method for Stochastic Gradient Boosting interpretation, our results also revealed interesting non-linear trends in the variables originating from the app, which would not normally be available to traditional banks. Our results represent an opportunity for technology companies to disrupt traditional banking by correctly identifying alternative data sources and handling this new information properly. At the same time, alternative data must be carefully validated to overcome regulatory hurdles across diverse jurisdictions.

arXiv

Scenario reduction techniques are widely applied for solving sophisticated dynamic and stochastic programs, especially in energy and power systems. We propose a new method for ensemble and discrete scenario reduction based on the energy distance, which is a special case of the maximum mean discrepancy (MMD). We discuss the choice of the energy distance in detail, especially in comparison to the popular Wasserstein distance, which dominates the scenario reduction literature. The energy distance is a metric between probability measures that allows for powerful tests for equality of arbitrary multivariate distributions, or for independence. Thanks to the latter, it is a suitable candidate for ensemble and scenario reduction problems. The theoretical properties and the considered examples clearly indicate that the reduced scenario sets tend to exhibit better statistical properties under the energy distance than a corresponding reduction with respect to the Wasserstein distance. We show applications to binary trees and real-data examples for electricity demand profiles.
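
The energy distance between two samples has the standard empirical form 2E|X-Y| - E|X-X'| - E|Y-Y'|, and scenario reduction then searches for a small subset close to the full ensemble in this metric. The brute-force subset search below is purely illustrative (the paper's actual reduction algorithm will be more refined), and the example scenarios are made up.

```python
import itertools
import math

def energy_distance(xs, ys):
    """Squared energy distance between the empirical distributions of
    two samples of d-dimensional points: 2 E|X-Y| - E|X-X'| - E|Y-Y'|.
    Nonnegative, and zero iff the empirical distributions coincide;
    a special case of the maximum mean discrepancy."""
    def mean_dist(a, b):
        return sum(math.dist(p, q) for p in a for q in b) / (len(a) * len(b))
    return 2 * mean_dist(xs, ys) - mean_dist(xs, xs) - mean_dist(ys, ys)

def reduce_scenarios(scenarios, k):
    """Pick the size-k subset closest (in energy distance) to the full
    ensemble -- brute force, viable only for tiny instances."""
    return min(itertools.combinations(scenarios, k),
               key=lambda sub: energy_distance(list(sub), scenarios))

# Reducing three 1-d scenarios to two keeps the spread, not the cluster:
kept = reduce_scenarios([(0.0,), (0.1,), (10.0,)], 2)
```

Here the reduction retains (0.0,) and (10.0,) rather than the two nearly identical scenarios, because discarding the outlier would distort the empirical distribution far more.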

arXiv

Tax evasion is the illegal evasion of taxes by individuals, corporations, and trusts. The revenue loss from tax evasion can undermine the effectiveness and equity of government policies. A standard measure of tax evasion is the tax gap, which can be estimated as the difference between the total amount of tax theoretically collectable and the total amount of tax actually collected in a given period. This paper presents an original contribution to the bottom-up approach, based on results from fiscal audits, through the use of Machine Learning. The major disadvantage of bottom-up approaches is selection bias when audited taxpayers are not randomly selected, as in the case of audits performed by the Italian Revenue Agency. Our proposal, based on a 2-step Gradient Boosting model, produces a robust tax gap estimate and embeds a correction for the selection bias which does not require any assumptions on the underlying data distribution. The 2-step Gradient Boosting approach is used to estimate the Italian Value-added tax (VAT) gap for individual firms on the basis of fiscal and administrative income tax return data gathered from the Tax Administration database for the fiscal year 2011. The proposed method significantly boosts predictive performance with respect to classical parametric approaches.
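
The core of the selection-bias correction can be sketched with inverse-probability weighting: a first step models the probability of being audited, and a second step reweights audited outcomes by that probability. The sketch below is a simplification under stated assumptions: it replaces both Gradient Boosting steps with known selection probabilities and a weighted mean, and uses a wholly synthetic population, to show only the reweighting logic.

```python
import random

def ipw_mean(outcomes, audited, selection_prob):
    """Inverse-probability-weighted (Hajek) estimate of the population
    mean outcome when outcomes are observed only for audited units."""
    num = sum(y / p for y, a, p in zip(outcomes, audited, selection_prob) if a)
    den = sum(1.0 / p for a, p in zip(audited, selection_prob) if a)
    return num / den

# Synthetic population: high-risk firms (larger tax gap) are audited far
# more often, so the naive mean over audited firms is biased upward.
rng = random.Random(0)
gap, audited, p_sel = [], [], []
for _ in range(50_000):
    risky = rng.random() < 0.3
    g = rng.gauss(10, 2) if risky else rng.gauss(2, 1)
    p = 0.8 if risky else 0.1
    gap.append(g)
    p_sel.append(p)
    audited.append(rng.random() < p)

true_mean = sum(gap) / len(gap)                                  # ~4.4
naive = sum(g for g, a in zip(gap, audited) if a) / sum(audited)  # biased high
corrected = ipw_mean(gap, audited, p_sel)
```

In the paper's setting the selection probabilities are not known and are themselves estimated by the first-stage booster; the second stage then predicts the tax gap on the reweighted sample.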

arXiv

The high level of risk and uncertainty in harnessing oil and gas reserves poses an accounting dilemma in the reporting of reserves quantity information; information which is critical and relied on by investors for decision making. Different studies have indicated that reserves disclosure information is fundamental to understanding the value of the firm. This study attempts to contribute to the growing value relevance literature on reserves disclosures by examining the value relevance of the components of oil and gas reserve quantity change disclosures of upstream oil and gas companies on the London Stock Exchange. In particular, it investigates the relationship between average historical share returns and changes in reserves from explorations, acquisitions, production, revisions, and sales. It also examines the value relevance of the quality of these disclosures. Using archival data from the LSE, databases, and annual reports, and applying a multifactor framework, the empirical results suggest that changes in reserves, as well as the components of these changes, were associated with share returns, though insignificantly, due to the significant impact of the oil price and the longitudinal effect posed by applying the measurement approach which utilizes historical returns. However, the quality of reserves disclosures has a significantly positive relationship with share returns. The volatility and decline in the oil price is also reflected in both a low average share return of -0.4% and low average growth in reserves of 8.94% over the last 8 years in the sector.