
WRAP - latest items

WRAP: Warwick Research Archive Portal. No filter conditions; results ordered by Date Deposited.

Islamic finance has become an integral part of the financial systems of the Muslim-majority countries of Southeast Asia. At the same time, Southeast Asia has witnessed the emergence of new capital market governance practices and arrangements that are both multi-scalar and multi-sited. This article suggests that rather than only looking at the scale and rescaling of capital market governance in the region, more attention needs to be paid to the shifting balances between regulatory expertise, market practice and societal expectations. Indeed, for governance practices to be considered effective, they have to straddle the at times competing demands of authority and legitimacy. This dynamic is nowhere more visible than in the case of Islamic finance, which explicitly involves Sharia experts, trained in Islamic law, in its governance structures. This article explores the novel forms of governance to which this new market has given rise. It argues that Islamic finance, rather than being the product of privately held beliefs, has become increasingly bound up with the state apparatus. This facilitates the embedding of Islamic financial principles and ethical concerns throughout capital markets in the region. Yet Islamic finance has also become increasingly submerged within national development and competitiveness agendas.

A simple 1H and 13C NMR spectroscopic analysis is demonstrated that permits differentiation of isoleucine and allo-isoleucine residues by inspection of the chemical shifts and coupling constants of the signals associated with the proton and carbon at the α-stereocentre. This is applied to the estimation of epimerisation during metal-free N-arylation and peptide coupling reactions.

The present study assessed the suitability of the Compulsive Exercise Test (athlete version; CET-A) for identifying female athletes with clinically significant features related to or comparable with eating psychopathology. Three hundred and sixty-one female athletes (including 12 with a clinically diagnosed eating disorder) completed the Eating Disorders Examination Questionnaire (EDE-Q) and the CET-A. Receiver operating characteristic (ROC) curve analysis was employed to identify a cut-off value on the CET-A that could indicate clinically significant features related to or comparable with eating psychopathology among female athletes. The analysis demonstrated that a CET-A score of 10 successfully discriminated female athletes with a current eating disorder. The results suggest that the CET-A may be a suitable tool for detecting eating psychopathology in female athletes. Additional longitudinal research is needed to evaluate the predictive value of the CET-A.
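
For illustration only, the sketch below shows how a screening cut-off of this kind can be derived from an ROC analysis by maximising Youden's J statistic; the simulated scores and all numbers are hypothetical, and this is not the study's actual analysis.

```python
# Sketch: choosing a screening cut-off via ROC analysis (Youden's J).
# Hypothetical simulated data; not the study's actual CET-A scores.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# Simulated CET-A totals: 349 athletes without and 12 with an eating disorder.
scores = np.concatenate([rng.normal(7, 2.5, 349), rng.normal(13, 2.5, 12)])
labels = np.concatenate([np.zeros(349), np.ones(12)])

fpr, tpr, thresholds = roc_curve(labels, scores)
youden_j = tpr - fpr                       # sensitivity + specificity - 1
best = thresholds[np.argmax(youden_j)]     # cut-off maximising Youden's J
print(f"Suggested cut-off: {best:.1f}")
```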

Fundamental to many predictive analytics tasks is the ability to estimate the cardinality (number of data items) of multi-dimensional data subspaces defined by query selections over datasets. This is crucial for data analysts dealing with, e.g., interactive data subspace explorations, data subspace visualizations, and query processing optimization. However, in many modern data systems, predictive analytics may be (i) too costly money-wise, e.g., in clouds, (ii) unreliable, e.g., in modern Big Data query engines, where accurate statistics are difficult to obtain/maintain, or (iii) infeasible, e.g., for privacy issues. We contribute a novel, query-driven function estimation model of analyst-defined data subspace cardinality. The proposed estimation model is highly accurate in its predictions and accommodates the well-known selection query types: multi-dimensional range and distance-based nearest-neighbor (radius) queries. Our function estimation model: (i) quantizes the vectorial query space by learning the analysts' access patterns over a data space, (ii) associates query vectors with the corresponding cardinalities of the analyst-defined data subspaces, (iii) abstracts and employs query vectorial similarity to predict the cardinality of an unseen/unexplored data subspace, and (iv) identifies and adapts to possible changes of the query subspaces based on the theory of optimal stopping. The proposed model is decentralized, facilitating the scaling-out of such predictive analytics queries. The research significance of the model lies in the fact that (i) it is an attractive solution when data-driven statistical techniques are undesirable or infeasible, (ii) it offers a scale-out, decentralized training solution, (iii) it is applicable to different selection query types, and (iv) it offers performance superior to that of data-driven approaches.
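
A minimal sketch of the query-driven idea follows, assuming a k-means quantisation of the query space and nearest-prototype prediction; these stand in for the paper's specific learning and similarity machinery, and all data and parameter values are hypothetical.

```python
# Sketch: query-driven cardinality estimation via query-space quantisation.
# k-means and nearest-prototype prediction stand in for the paper's model.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical training set: each row is a query vector (e.g. range bounds),
# paired with the observed cardinality of the selected data subspace.
queries = np.random.rand(1000, 4)             # past analyst queries
cards = np.random.randint(1, 10_000, 1000)    # observed cardinalities

# (i) quantise the vectorial query space into prototypes
km = KMeans(n_clusters=16, n_init=10).fit(queries)
# (ii) associate each prototype with the mean cardinality of its queries
# (clusters are assumed non-empty for this toy data)
proto_card = np.array([cards[km.labels_ == k].mean() for k in range(16)])

def estimate(query_vec):
    # (iii) predict via similarity to the nearest prototype
    k = km.predict(query_vec.reshape(1, -1))[0]
    return proto_card[k]

print(estimate(np.random.rand(4)))
```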

Chapter 1 is an overview of the thesis in which I explain why work on housing markets merits attention, discuss two broad questions that motivated the research, emphasise the particular avenue I have chosen to pursue, and summarise the new insights to be learned. I also include a short discussion on the methodologies that are used.

In Chapter 2, I introduce information heterogeneity into a user-cost house pricing model. I use the model to shed light on two empirical regularities in the housing market: the predictability of housing returns and the positive relationship between rent volatility and housing prices. The model also yields predictions on overpricing and excess volatility in housing prices.
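
For context, the textbook user-cost condition that such models typically build on (a standard benchmark rather than the chapter's exact specification, with symbols as commonly defined) equates rent to the cost of owning:

$$ R_t = P_t\left(i_t + \tau_t + \delta - \mathbb{E}_t\!\left[g_{t+1}\right]\right), $$

where $R_t$ is rent, $P_t$ the house price, $i_t$ the interest rate, $\tau_t$ the property tax rate, $\delta$ maintenance and depreciation, and $\mathbb{E}_t[g_{t+1}]$ the expected rate of house price appreciation.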

In Chapter 3, I study a Real Business Cycle model with borrowing constraints and incomplete information. I show that in such an environment noise in signals may have real effects on the macroeconomy; the effects are induced by learning and are amplified and propagated by collateral effects. Noise may generate sizeable and persistent fluctuations in consumption, credit, asset prices, and output.

In Chapter 4, I implement a new strategy to identify the shocks that drive the co-movements between housing prices and consumption. My results show that, in the United Kingdom, productivity shocks, and especially news shocks about future productivity, explain most of the co-movements. I also show that more than half of the changes in housing price growth were not related to changes in consumption growth, which casts doubt on the importance of housing wealth effects on consumption.

Deriving meaningful insights from location data helps businesses make better decisions. One critical decision made by a business is choosing a location for its new facility. Optimal location queries ask for a location to build a new facility that optimizes an objective function. Most of the existing work on optimal location queries proposes solutions that return the best location when the set of existing facilities and the set of customers are given. However, most businesses do not know the locations of their customers. In this paper, we introduce a new problem setting for optimal location queries by removing the assumption that customer locations are known. We propose an optimal location predictor which accepts partial information about customer locations and returns a location for the new facility. The predictor generates synthetic customer locations from the given partial information and runs optimal location queries over the generated location data. Experiments with real data show that the predictor can find the optimal location when sufficient information is provided.
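
A minimal sketch of the predictor's two stages under illustrative assumptions (district-level counts as the "partial information", and total travel distance as the objective; all names and numbers are hypothetical, not the paper's method):

```python
# Sketch: optimal location query over synthetic customer locations.
# The sampling scheme and objective below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Partial information: per-district customer counts (hypothetical).
district_centres = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 7.0]])
district_counts = [400, 250, 350]

# Stage 1: generate synthetic customers around each district centre.
customers = np.vstack([
    rng.normal(c, 1.0, size=(n, 2))
    for c, n in zip(district_centres, district_counts)
])

# Stage 2: run the location query on the synthetic data,
# here minimising total customer travel distance over candidate sites.
candidates = rng.uniform(0, 10, size=(500, 2))
totals = [np.linalg.norm(customers - s, axis=1).sum() for s in candidates]
best_site = candidates[int(np.argmin(totals))]
print("Chosen facility location:", best_site)
```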

6H-Benzo[cd]pyrene ('Olympicene') is a polyaromatic hydrocarbon and non-Kekulé fragment of graphene. A new synthetic method has been developed for the formation of 6H-benzo[cd]pyrene and related ketones, including the first isolation of the unstable alcohol 6H-benzo[cd]pyren-6-ol. Molecular imaging of the reaction products with scanning tunnelling microscopy (STM) and non-contact atomic force microscopy (NC-AFM) characterised the 6H-benzo[cd]pyrene as well as the previously intangible and significantly less stable 5H-benzo[cd]pyrene, the fully conjugated benzo[cd]pyrenyl radical, and the ketones as oxidation products.

Enantiomerically enriched trichloromethyl-containing alcohols, obtained by asymmetric reduction, can be transformed regioselectively into 1-substituted piperazinones by modified Jocic reactions with little or no loss of stereochemical integrity. This methodology can readily be used to synthesise important pharmaceutical compounds such as the fluorobenzyl intermediate of a known PGGTase-I inhibitor.

Stereo- and chemodivergent enantioselective reaction pathways are observed upon treatment of alkylarylketenes and trichloroacetaldehyde (chloral) with N-heterocyclic carbenes, giving selectively either β-lactones (up to 88:12 dr, up to 94% ee) or α-chloroesters (up to 94% ee). Either 2-aryl substitution or an α-branched iPr alkyl substituent within the ketene favours the chlorination pathway, allowing chloral to be used as an electrophilic chlorinating reagent in asymmetric catalysis.

The advent of modern High Performance Computing (HPC) has facilitated the use of powerful supercomputing machines that have become the backbone of data analysis and simulation. With such a variety of software and hardware available today, understanding how well such machines can perform is key to both efficient use and future planning. With significant costs and multi-year turn-around times, the procurement of a new HPC architecture can be a major undertaking.

In this work, we introduce one such measure for capturing the performance of these machines: analytical performance models. These models provide a mathematical representation of the behaviour of an application, describing how its various components perform on a given architecture. By parameterising an application's workload such that its compute time can be described in relation to one or more benchmarkable statistics, a performance model provides a reusable representation of the application that can be applied to multiple architectures.
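
As a hedged illustration of the general technique (not the thesis's actual Hydra model), a runtime model might parameterise compute and communication against machine constants obtained from benchmarks; all constants and the cost breakdown below are assumptions.

```python
# Sketch: a simple analytical performance model for a structured-mesh code.
# Machine constants and the cost breakdown are illustrative assumptions.

def predicted_runtime(cells, timesteps, procs,
                      t_grind=2.0e-7,    # seconds per cell-update (benchmarked)
                      t_lat=1.5e-6,      # network latency per message (benchmarked)
                      t_byte=5.0e-10,    # seconds per byte transferred (benchmarked)
                      halo_bytes=8_192): # halo-exchange message size (assumed)
    """Predict total walltime as compute plus halo-exchange communication."""
    compute = timesteps * (cells / procs) * t_grind
    comm = timesteps * (t_lat + halo_bytes * t_byte)
    return compute + comm

# Compare hypothetical machines by swapping in their benchmarked constants.
print(predicted_runtime(cells=64**3, timesteps=10_000, procs=128))
```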

This work goes on to introduce one such benchmark of interest, Hydra. Hydra is a 3D Eulerian structured-mesh hydrocode benchmark implemented in Fortran, with which the explosive compression of materials, shock waves, and the behaviour of materials at the interfaces between components can be investigated. We assess its scaling behaviour and use this knowledge to construct a performance model that predicts the runtime to within 15% across three separate machines, each with its own distinct characteristics. Further, this work explores various optimisation techniques, some of which yield a marked speedup in the overall walltime of the application. Finally, another software application of interest with similar behaviour patterns, PETSc, is examined to demonstrate how different applications can exhibit similar, modellable patterns.

This thesis interrogates the relationship between the researcher and the immersive media ecology by developing an immersive-participatory method which builds on autoethnography and makes the researcher's experience central. It theorises immersion during gameplay as an affective, embodied state, which is both active and passive, and achieved via visual engagement, projective identification, and haptic communication with the player character and game world. Deploying a mode of qualitative content analysis alongside this immersive method, this project makes visible and utilises the relationship between the researcher and the object of study. As such, it attains critical purchase on the affective and embodied experience of narrative, immersive and mechanical game elements.

Three overarching themes have emerged from this research: the affective and persuasive elements of immersive engagement; the players' ability to agentfully negotiate the freedoms and constraints of the gameworld; and the ideological positioning of the player within certain subjectivities. In order to examine these themes, I interrogate three narrative and mechanical branches which are common to the games studied. Firstly, how trauma, vulnerability and spectacle are deployed within game narratives and structures, and how they serve to attach the player and motivate them to overcome and master. Secondly, how (bio)dystopian game worlds and the mechanical incentivisation of accumulation work at cross purposes, both expressing anxieties about late-capitalist ideologies and structuring player desire along neoliberal lines. And finally, the evocation of ethical response through 'moral' game mechanics and the space for players to interpret, negotiate, and play with ethical acts. In following these lines of analysis, this thesis reveals broader cultural tensions surrounding identification, immersion, and knowledge, specifically regarding questions of affect, desire, and ethical decision-making.

This paper argues that the convention of allocating donated gametes on a "first come, first served" basis should be replaced with an allocation system that takes into account more morally relevant criteria than waiting time. This conclusion was developed using an empirical bioethics methodology, which involved a study of the views of eighteen staff members from seven UK fertility clinics, and twenty academics, policy-makers, representatives of patient groups, and other relevant professionals, on the allocation of donated sperm and eggs. Against these views, we consider some nuanced ways of including criteria in a points allocation system. We argue that such a system is more ethically robust than "first come, first served", but we acknowledge that our results suggest that a points system will meet with resistance from those working in the field. We conclude that criteria such as a patient's age, potentially damaging substance use, and parental status should be used to allocate points and determine which patients receive treatment and in what order. These and other factors should be applied according to how they bear on considerations like child welfare, patient welfare, and the effectiveness of the proposed treatment.

Whilst not a 'new' pest in the UK, Aleyrodes proletella has become an increasing problem for the Brassica industry in recent years, especially on Brussels sprout and kale. The reason for the increasing problem is unknown, but it is believed to be due to a combination of climate change, the removal of certain active ingredients from use, and later crop harvest times. Relatively little research has focused on this species as, historically, it has been regarded as a minor pest. Knowledge about the biology of A. proletella is limited and some of what is currently understood about its ecology has been inferred from anecdotal evidence.

The overall aim of this project is to understand population trends of A. proletella in the most vulnerable crops, Brussels sprout and kale. This includes understanding the key times of population increase and colonisation of new crops. This information can then be used to inform the development of an integrated control strategy using insecticides and other tools, which might include biological control agents and methods of cultural or physical control.

Experiments to investigate the vertical and horizontal distribution of flights by A. proletella showed that the species performs mainly low, short-distance flights throughout most of the year, and it is these flights that are mostly responsible for the colonisation of new vulnerable crops, which can be achieved by overwintering females early in the season. Monitoring of field populations on kale showed that whitefly populations develop without regulation by predators or parasitoids, with increases in numbers mostly determined by the development of further generations from the initial immigrants to the crop. The population size that A. proletella can reach within a crop seems to be governed by the number of generations that can develop before the onset of diapause in September, which prevents further reproduction. A new fungal pathogen, which causes epizootics within the field, was observed; it killed over 90% of adult A. proletella. Of all the potential natural enemies, this pathogen had the greatest capacity to reduce A. proletella infestations and offers potential for the development of a new method of biological control.

Recent research has provided evidence that mood can spread over social networks via social contagion but that, in seeming contradiction to this, depression does not. Here, we investigate whether there is evidence for the individual components of mood (such as appetite, tiredness and sleep) spreading through US adolescent friendship networks, adjusting for confounding by modelling the transition probabilities of changing mood state over time. We find that having more friends with worse mood is associated with a higher probability of an adolescent's mood worsening and a lower probability of it improving, and vice versa for friends with better mood, for the overwhelming majority of mood components. We also show, however, that this effect is not strong enough in the negative direction to lead to a significant increase in the incidence of depression, helping to resolve the seemingly contradictory nature of existing research. Our conclusions therefore link into current policy discussions on the importance of subthreshold levels of depressive symptoms and could help inform interventions against depression in high schools.
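
As a generic illustration of modelling transition probabilities as a function of friends' mood (a simple logistic formulation with simulated data and assumed variable names, not the paper's exact specification):

```python
# Sketch: modelling mood-state transitions as a function of friends' mood.
# A generic logistic regression on simulated data; not the paper's model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
worse_friends = rng.poisson(2, n)       # friends in a worse mood state
better_friends = rng.poisson(2, n)      # friends in a better mood state
logit = -1.0 + 0.3 * worse_friends - 0.3 * better_friends
worsened = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # did mood worsen?

X = sm.add_constant(np.column_stack([worse_friends, better_friends]))
model = sm.Logit(worsened, X).fit(disp=0)
print(model.params)  # per-friend effect on the log-odds of worsening
```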

Aim:
Software use is ubiquitous in the species distribution modelling (SDM) domain: nearly every scientist working on SDM either uses or develops specialist SDM software. However, little is formally known about the prevalence of, or preference for, one software package over another. We seek to provide, for the first time, a 'snapshot' of SDM users, the methods they use and the questions they answer.

Location:
Global.

Methods:
We conducted a survey of over 300 SDM scientists to capture a snapshot of the community, and used an extensive literature search of SDM papers to investigate the characteristics of the SDM community and its interactions with software developers in terms of co-authoring research publications.

Results:
Our results show that those members of the community who develop software, and those who are directly connected with developers, are among the most highly connected and published authors in the field. We further show that the two most popular software packages for SDM lie at opposite ends of the 'use-complexity' continuum.

Main conclusion:
Given the importance of SDM research in a changing environment, with its increasing use in the policy domain, it is vital to be aware of what software and methodologies are being implemented. Here, we present a snapshot of the SDM community, the software and the methods being used.

1. The rapid growth of species distribution modelling (SDM) as an ecological discipline has resulted in a large and diverse set of methods and software for constructing and evaluating SDMs. The disjointed nature of the current SDM research environment hinders the evaluation of new methods, the synthesis of current knowledge and the dissemination of new methods to SDM users.

2. The zoon R package aims to overcome these problems by providing a modular framework for constructing reproducible SDM workflows. zoon modules are interoperable snippets of R code, each carrying an SDM method, which zoon combines into a single analysis object.

3. Rather than defining these modules itself, zoon draws them from an open, version-controlled online repository. zoon makes it easy for SDM researchers to contribute modules to this repository, enabling others to rapidly deploy new methods in their own workflows or to compare alternative methods.

4. Each workflow object created by zoon is a re-runnable record of the data, code and results of an entire SDM analysis. This can then be easily shared, scrutinised, reproduced and extended by the whole SDM research community.

5. We explain how zoon works and demonstrate how it can be used to construct a completely reproducible SDM analysis, create and share a new module, and perform a methodological comparison study.

In asserting the value of a "sociology of language", we seek to interrogate the distinct identity, problematics and boundaries of this field. Three objectives therefore run through this contribution: to review various attempts at founding the field, to define its present-day challenges and problematics, and finally to give an idea of possible framings of analysis through two examples of fieldwork to which this research programme is applied. While our cartography begins by discussing macro and micro approaches in the United States, the United Kingdom, Germany and France (part one), it also notes the limits of a binary vision that opposes the macro- and micro-sociological levels, and shows how to move beyond it by placing the emphasis on language practices (part two). We illustrate this proposal by presenting two examples of sociological research in part three: the first concerning support practices for jobseekers as practices that produce new contemporary forms of individuation (Glady), the second dealing with academic research as a discursive practice through which researchers position themselves (Angermuller). In this work, we start from the idea that the social is realised through language practices and that language practices are social practices. Departing from formalist linguistics, we dispense with the notion of "langue" as an abstract system of rules and grammatical possibilities. We adopt the view that language is a human activity that draws on a stock of linguistic resources to accomplish social practices.

Explosions in homogeneous reactive mixtures have been widely studied both experimentally and numerically. However, in practice, combustible mixtures are usually inhomogeneous and subject to both vertical and horizontal concentration gradients. There is still very limited understanding of the explosion characteristics in such situations. The present study aims to investigate deflagration to detonation transition (DDT) in such mixtures. Two cases in a horizontal obstructed channel with 30% and 60% blockage ratios, filled with hydrogen/air mixture with vertical concentration gradients, are numerically studied. These cases were experimentally investigated by Boeck et al. (2015), and hence some measurements are available for model validation. A density-based solver within the OpenFOAM CFD toolbox is developed and used. To evaluate the contribution of the convective fluxes, the Harten–Lax–van Leer–Contact (HLLC) scheme is used for shock capturing. The compressible Navier–Stokes equations with a single-step Arrhenius reaction are solved. The numerical results are in good qualitative and quantitative agreement with the experiments. The predictions show that the overpressure at the DDT transition stage is higher in non-uniform mixtures than in homogeneous mixtures under similar conditions. It is also found that increasing the blockage ratio from 30% to 60% resulted in faster flame propagation and a lower propensity to DDT. Baroclinic torque and the resulting Richtmyer–Meshkov (RM) instability are also analysed in relation to flame acceleration and DDT.
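
For reference, a single-step Arrhenius reaction model of the kind mentioned typically takes the following form; this is a generic sketch with assumed parameter values, not the solver's actual implementation.

```python
# Sketch: single-step Arrhenius reaction source term for a reactive flow solver.
# Parameter values are placeholders, not those of the referenced simulations.
import numpy as np

R_UNIVERSAL = 8.314  # universal gas constant, J/(mol K)

def reaction_rate(rho, Y_fuel, T, A=1.0e9, E_a=1.2e5):
    """Fuel consumption rate: omega = A * rho * Y * exp(-E_a / (R T))."""
    return A * rho * Y_fuel * np.exp(-E_a / (R_UNIVERSAL * T))

# Example: rate for a hot hydrogen/air parcel at 1200 K (illustrative values).
print(reaction_rate(rho=0.8, Y_fuel=0.028, T=1200.0))
```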

Growth in demand for Liquefied Natural Gas (LNG) has increased calls for further research and development on LNG production and safer methods for its transportation. This paper presents the implementation of numerical models for the dispersion of evaporated LNG in the open atmosphere. The developed model incorporates LNG spill and pool formation into a source model, which is then coupled with a Computational Fluid Dynamics (CFD) approach in OpenFOAM for dispersion calculations. Atmospheric conditions such as average wind speed and direction were used to resolve the wind boundary layers. The model also accounts for humidity and its influence on air density and buoyancy. Verification was conducted against experimental results from the Maplin Sands series of tests by comparing the maximum evaporated-gas concentration at each arc relative to the release point. The results show good agreement between the model's predictions and the experiments.
