Asia Pacific Productivity Conference 2014

Book of Abstracts

ID: 001 The Impact of Phase 1 Compliance with the 1990 Clean Air Act on Total Factor Productivity Growth in the Electric Utility Industry.

Authors: Finn Forsund and Gerald Granderson Contact: [email protected] , [email protected] Category: SFA - Applications to Regulation
Abstract: Phase 1 of Title IV of the 1990 Clean Air Act, effective from 1995 to 1999, required 110 power plants to reduce their sulfur dioxide (SO2) emissions to levels relative to their fuel use in the 1985-1987 period. The same 110 power plants were also required to reduce their nitrogen oxide (NOx) emissions over 1996 to 1999. Firms can reduce SO2 emissions by using less coal (and more oil, natural gas or other fuel sources) or lower-sulfur coal, or by installing equipment (e.g. scrubbers), and can reduce NOx emissions by retrofitting power plants with low-NOx burners or employing modifications that control fuel and air mixing. Firms could also adopt a new technology that reduces both SO2 and NOx emissions. Procedures enacted to reduce SO2 and NOx emissions could (i) reduce total factor productivity (TFP) growth, (ii) increase marginal cost, which could affect the scale economies component of TFP growth, and (iii) affect the rate of technological change. We examine whether having to comply with Phase 1 contributed to a reduction in both TFP growth and its scale economies component. We also investigate whether the different procedures employed to reduce SO2 emissions, compared with those used to reduce NOx emissions, had different impacts on the rate of technological change. Phase 1 utilities were given an initial allocation of SO2 permits by the Environmental Protection Agency (EPA). Firms could also purchase additional SO2 permits through a yearly allowance auction market. For nitrogen oxide there was no EPA permit trading program until 1999. Phase 1 utilities most likely had a smaller initial allocation of NOx permits than of SO2 permits, and thus may have needed to alter their production process more substantially to obtain the necessary reductions in NOx emissions. We test whether the cost of reducing NOx emissions to comply with Phase 1 differed from the cost of reducing SO2 emissions.

ID: 002 A generalised approach to measuring service productivity
Authors: Moira Scerri Contact: [email protected] Category: ECR Day
Abstract: The contribution of services to the economies of developed and developing nations cannot be overstated. The service sector now contributes over 85% of Australia’s gross domestic product and employs over 85% of the Australian workforce. However, the transition from agriculture and manufacturing to service and knowledge based industries has been paralleled by decreases in productivity growth rates. The traditional measures of productivity, established in a predominantly tangible or goods-dominant production environment, are being challenged. Their applicability to the service sector is questionable given the intangible nature of services and the difficulty of identifying and quantifying the key factors contributing to service productivity. The Service Enterprise Productivity in Action (SEPIA) model is developed using Boulding’s nine-level general systems hierarchy and is a first attempt at operationalising service productivity. The SEPIA model includes inputs from multiple stakeholders: customers, employees and suppliers. In this exploratory study customer input is seen as a factor of production and a key contributor to service productivity. The customer interface variable is made up of intangible factors such as service complexity, customer interactions, customer loyalty and customer channel as key inputs, and customers’ willingness to pay is used as a proxy for value. It should be noted that this list of measures is not exhaustive; rather, it is an attempt to highlight the importance of intangible factors in the measurement of service productivity. Transaction-level data from 14 (non-homogeneous) firms operating in the Australian travel and tourism industry are coded using a SEPIA coding schema. The data are then analysed using a DEA input-oriented variable returns to scale model. The results are presented along with managerial implications, limitations and areas for future research.
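Editor's note: for readers unfamiliar with the input-oriented variable returns to scale DEA model referred to above, a minimal sketch of the standard envelopment linear program follows. This is not the author's implementation; the data and variable names are purely illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_vrs(X, Y):
    """Input-oriented VRS DEA efficiency scores.
    X: (n, m) inputs, Y: (n, s) outputs for n decision-making units."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Decision vector z = [theta, lambda_1, ..., lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        # VRS convexity constraint: sum_j lambda_j = 1
        A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores[o] = res.fun
    return scores

# Illustrative data: 5 DMUs, 2 inputs, 1 output
X = np.array([[4., 3.], [7., 3.], [8., 1.], [4., 2.], [2., 4.]])
Y = np.array([[1.], [1.], [1.], [1.], [1.]])
print(dea_input_vrs(X, Y))  # efficient units score 1.0
```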

ID: 003 Profit changes in the insurance market under an undesirable output approach, the Mexican case
Authors: Ana Maria Reyna Bernal and Hugo Javier Fuentes Castro Contact: [email protected] , [email protected] Category: DEA - Other Applications
Keywords: insurance, productivity, undesirable outputs, data envelopment analysis
Abstract: Deregulation through NAFTA in 1994 allowed the gradual entry of Canadian and North American insurance companies into the Mexican insurance market. Since then, the number of insurers has expanded by 72%; however, the share of insurance activity in GNP has grown slowly. We propose to examine the performance of the Mexican insurance sector with a model and a methodology adapted to the circumstances of this industry. Claims have typically been treated as outputs of insurance companies, even though this contravenes the association between productivity and profitability when catastrophic events cause losses and high productivity scores at the same time. Our analytical method differs from this approach in that we suggest treating claims as undesirable outputs of the insurance industry. We extend the Grifell-Tatjé and Lovell (1999) model to measure the magnitude of four components of profit change, decomposing them into resource, bad-product and good-product effects. Empirical frontiers are estimated from insurance firms grouped according to their specialization, applying a partial frontier with a bad-outputs transformation method.

ID: 005 Playing hide and seek. How to tackle the endogeneity problem in empirical DEA applications?
Authors: Gabriela Sicilia and Daniel Santín Contact: [email protected] , [email protected] Category: DEA - Theory
Abstract: Non-parametric techniques, and especially data envelopment analysis (DEA), are the most commonly applied methods for measuring technical efficiency. Selecting the appropriate input and output variables to include in the model is the most critical decision practitioners must make in order to obtain reliable efficiency scores. On this point, the literature has identified several issues that can be detrimental to DEA estimates, such as the inclusion of irrelevant variables in the model, the omission of relevant inputs, or the presence of correlation between inputs. However, one of the most important current concerns in econometrics, the presence of endogeneity in the production process, has received little attention in the literature and is frequently overlooked when practitioners apply DEA. In a recent work, Cordero, Santín and Sicilia (2013) found that DEA estimates can be severely impaired in the presence of positive endogeneity, that is, when at least one input in the production process is positively correlated with the true efficiency. Two key issues therefore emerge: how can endogeneity be detected, and how can this problem be overcome in DEA applications? These are the aims of the present research. To address them, we use synthetic data generated in a Monte Carlo experiment to compare different endogenous scenarios with an exogenous one. As a result, we propose a simple statistical procedure which allows practitioners to identify the presence of an endogenous input in an empirical application. In addition, we evaluate two potential solutions to deal with this problem and improve DEA estimates: omitting the endogenous input and using an “instrument input”. The results show that both options significantly improve DEA estimates and that the final choice between the two methods depends on the specific purpose for which the research is carried out. Finally, we perform an empirical application in the education sector to illustrate our theoretical findings.
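Editor's note: as a rough illustration of the kind of Monte Carlo design described above (not the authors' actual experiment; the functional form, parameter values and the strength of endogeneity rho are all assumptions), one can generate a single-output technology in which one input is positively correlated with true efficiency:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# True inefficiency and true technical efficiency in (0, 1]
u = rng.uniform(0.0, 0.4, n)
eff = np.exp(-u)

# One exogenous input, and one input positively correlated with true efficiency
x1 = rng.lognormal(mean=1.0, sigma=0.3, size=n)
rho = 0.8  # strength of the induced endogeneity (assumed)
x2 = np.exp(rho * np.log(eff / eff.mean())
            + np.sqrt(1 - rho**2) * rng.normal(0.0, 0.3, n) + 1.0)

# Cobb-Douglas frontier output, scaled down by inefficiency
y = eff * x1**0.4 * x2**0.5

print("corr(log x2, log efficiency):",
      np.corrcoef(np.log(x2), np.log(eff))[0, 1].round(2))
```

Feeding data of this kind into a DEA model and comparing the estimated scores with the known true efficiency is the usual way such endogeneity effects are studied in simulation work.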

ID: 006 What makes an efficient classroom? Evidence for primary education in Spain
Authors: Daniel Santín, José Manuel Cordero and Gabriela Sicilia Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Health & Education
Abstract: The aim of our research is to identify the main drivers of technical efficiency in Spanish primary schools. We are particularly interested in identifying the most efficient classrooms and exploring which teaching and educational practices should be promoted in order to improve academic outcomes with the current allocation of educational resources. To do this, we perform a semiparametric two-stage model (Simar and Wilson, 2007) using the “Evaluación General de Diagnóstico 2009” (EGD) database from the Spanish Ministry of Education. Unlike most of the available international databases, the EGD provides data at the classroom level, which allows us to explore efficiency differences between groups within the same school (with the same school resources, school organization and student background), that is, differences between teachers and the way they work in their classrooms. We aim to identify the most efficient classrooms and subsequently analyze the techniques and teaching methods that lead to the best results given the same level of school resources and student characteristics. In the first stage we use Data Envelopment Analysis (DEA) to estimate efficiency scores, which are then regressed on school and student contextual variables in order to explore the main efficiency drivers. The results show that, on average, Spanish primary schools could improve academic outcomes by 12%. In addition, we find significant differences when comparing the efficiency of classrooms within the same school: in 35.5% (13.5%) of the analyzed groups, the teacher and teaching practices explain five to ten (more than ten) efficiency points. Finally, regarding the main drivers of efficiency, our findings suggest that education policy should focus on supporting students at high risk of grade repetition at an early age, encouraging daily extracurricular reading, promoting classroom teaching techniques involving more individual work by students and less exposition time by teachers, and seeking greater involvement of parents in their children’s education.

ID: 009 Do Investors Value Firm Efficiency? Evidence from the Australian Context
Authors: Hai Yen Pham, Richard Chung and Eduardo Roca Contact: [email protected] , [email protected] , [email protected] Category: SFA - Other Applications
Abstract: There is an urgent call for Australian companies to improve their productivity and efficiency as Australia faces declining labour force participation due to an aging population. But do investors value improvements in efficiency at all? This paper investigates the relation between the change in firm efficiency and subsequent stock returns. We employ stochastic frontier analysis (SFA) to evaluate firm efficiency for a large panel of non-financial companies in the Australian securities market from January 1990 to October 2012. Firm efficiency is estimated by comparing the benchmark Tobin’s Q of a hypothetical value-maximizing firm to a firm’s actual Tobin’s Q. The results show that over the sample period the estimated efficiency score of the average firm is around 61.5%, improving by about 3% per year. We find that an equally-weighted (value-weighted) portfolio of stocks with a high change in efficiency outperforms an equally-weighted (value-weighted) portfolio of stocks with a low change in efficiency by an average of 11% (7%) per annum during the sample period. In cross-sectional analysis, the change in firm efficiency helps explain variation in the cross-section of stock returns even after controlling for known risk factors such as size, book-to-market, market liquidity, industry concentration and seasonality effects. Further, cross-sectional regression results by industry reveal that firm efficiency improvement has explanatory power for the cross-section of stock returns in six out of nine industries: materials (mining), industrials, consumer discretionary, consumer staples, health care and utilities. Thus, investors in the Australian context value improvements in firm efficiency.

ID: 010 Is Technological Change Skill Biased? A Global Value Chain Perspective
Authors: Marcel Timmer Contact: [email protected] Category: Other Topics
Abstract: Technological change in the past decades has been skill-biased. This is a generally held opinion among economists, and it is routinely used as an assumption when theorizing about trade, growth and labour markets. For example, Autor et al. (2008, Review of Economics and Statistics) find that technological change complements high-skilled tasks but substitutes for routine ones. Baltagi and Rich (2005, Journal of Econometrics) estimate production functions based on the Annual Survey of Manufactures and show rapid SBTC from 1983 onwards that is related to the diffusion of personal computers. But how strong is the empirical evidence for this assumption? In this paper we argue that current empirical studies neglect the fact that production processes fragment across borders (Timmer et al. 2014, Journal of Economic Perspectives). In that case, estimation with country-level data is no longer valid, as offshoring will be observationally equivalent to low-skilled labour saving technology. In this paper we identify vertically integrated production chains and derive factor cost shares of four factors: low-, medium- and high-skilled labour and capital. This is done for 560 manufacturing global value chains covering 40 countries and 14 manufacturing product groups (Timmer et al. 2014). Data are available for 1995-2008 from the World Input-Output Database (www.wiod.org). Assuming a translog production function, we estimate a system of three cost-share equations and derive estimates of biases in technological change for each factor. Thus we will be able for the first time to answer the question: is technological change really skill biased?
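Editor's note: a sketch of the kind of translog cost-share system referred to above, in generic notation (not necessarily the author's exact specification), with factor prices w_i, output y and a time trend t capturing technical change:

```latex
\ln C = \alpha_0 + \sum_i \alpha_i \ln w_i
      + \tfrac{1}{2}\sum_i\sum_j \beta_{ij}\ln w_i \ln w_j
      + \alpha_y \ln y + \sum_i \gamma_i\, t \ln w_i + \delta t,
\qquad
s_i \equiv \frac{\partial \ln C}{\partial \ln w_i}
    = \alpha_i + \sum_j \beta_{ij}\ln w_j + \gamma_i t .
```

The sign of gamma_i measures the bias of technical change towards factor i (for instance, a positive gamma for high-skilled labour together with a negative one for low-skilled labour would indicate skill bias). With four factors, linear homogeneity in prices means the shares sum to one, so only three of the four share equations need to be estimated, which matches the system of three cost-share equations mentioned in the abstract.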

ID: 012 How to measure a spillover effect of public capital stock: a spatial stochastic frontier model
Authors: Deockhyun Ryu, Jaepil Han and Robin C. Sickles Contact: [email protected] , [email protected] , [email protected] Category: SFA - Applications to Macro
Abstract: This paper investigates the spillover effect of public capital stock, mostly infrastructure SOC, in a production function model that accounts for spatial dependencies. In many settings, ignoring spatial dependency yields inefficient, biased and inconsistent estimates in cross-country panels. Although there are a number of studies that estimate the output elasticity of public capital stock, they fail to refine that elasticity and to account for the spillover effect of public capital on production efficiency when such spatial dependencies exist. For this purpose we employ a spatial stochastic frontier model and analyze estimates based on a number of specifications of the spatial dependency structure. We use geographic and economic relations to derive our spatial weights matrices. Using data for 22 OECD countries from 1960 to 2001, we estimate a spatial autoregressive stochastic frontier model and derive the mean indirect marginal effects of public capital stock, which are interpreted as spillover effects. In this way we can capture the difference between national-level and cross-country-level estimates of the output elasticity of public capital stock. Alternatively, we also estimate the spillover effect of public capital stock with the methods introduced by Holtz-Eakin and Schwartz (1995), who considered geographic dependency in the public capital stock. Preliminary results indicate that the spillover effect can be an important factor explaining variations in technical inefficiency across countries, as well as a crucial factor in explaining the discrepancies among various levels of the output elasticity of public capital stock in more traditional production function approaches.
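Editor's note: one common way to write a spatial autoregressive stochastic frontier of the kind described above is sketched below in generic notation; this is not necessarily the specification used by the authors.

```latex
y_{it} = \rho \sum_{j=1}^{N} w_{ij}\, y_{jt} + x_{it}'\beta + v_{it} - u_{it},
\qquad v_{it}\sim N(0,\sigma_v^2),\quad u_{it}\ge 0,
```

where W = [w_ij] is the spatial weights matrix. In reduced form, y_t = (I - rho W)^{-1}(X_t beta + v_t - u_t), so the marginal effect of a regressor x_k is the matrix (I - rho W)^{-1} beta_k; averaging its diagonal gives the direct effect, while averaging the off-diagonal row (or column) sums gives the indirect effect, which is the sense in which "mean indirect marginal effects" of public capital can be read as spillovers.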

ID: 013 Measuring the effect of bad loans on bank efficiency: how can we do this best?
Authors: David Tripe, Paul Rouse and Wen Qian Song Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Banking
Abstract: This study investigates how to deal with bad loans (as reflected in bad and doubtful debt expense) as an undesirable output in bank efficiency studies, noting that the literature has identified a number of alternative approaches. It uses New Zealand bank data for the period from March 2006 to March 2012, which includes the global financial crisis and thus provides an opportunity to look at significant variations in the levels of bad and doubtful debts. Unsurprisingly, bank efficiency was found to be lowest during 2008-2009, when the crisis was having its most profound effects on the New Zealand banking system, with the impact greater on larger banks. The inclusion of weight restrictions in the DEA analysis was found to accentuate the decline in estimated efficiency during the crisis period. The weight restrictions are also shown to have a realistic interpretation using production trade-offs between lending policy and interest income.

ID: 015 A Semiparametric Stochastic Cost Frontier Model with Time-varying Coefficients Subject to Regularity Conditions
Authors: Kai Sun, Subal Kumbhakar and Ragnar Tveterås Contact: [email protected] , [email protected] , [email protected] Category: SFA - Semi & Non-parametric Approaches
Abstract: This paper proposes a semiparametric stochastic cost frontier model in which the regression coefficients are unknown smooth functions of time that non-neutrally shift the cost frontier. To guarantee that the estimated smooth coefficients have an economically meaningful interpretation, the constraint weighted bootstrapping approach from the statistics literature is employed to impose the constraints implied by economic theory on the observation-specific estimates of the coefficients. We then apply the proposed methodology to Norwegian salmon production and investigate the estimated smooth coefficients, returns to scale, and the components of productivity growth.

ID: 017 How well do we account for intangible capital: An exploratory study using trademarks and design rights
Authors: Gaetan de Rassenfosse Contact: [email protected] Category: Growth Accounting
Abstract: Following Solow (1956), economists have paid particular attention to the role of technological change in economic growth. They have come to recognise that innovation is a fundamental source of economic growth. An immediate consequence is that failure to capitalise R&D investment in systems of national accounts (SNAs) severely distorts estimates of productivity growth. This issue is well acknowledged by economists and is addressed in the 2008 revision of the SNA. The argument has been pushed further by the identification of additional sources of growth. The most prominent example is marketing and advertising expenditure, which some scholars advocate should also be capitalised (Nakamura 1999; Khan 2001). These scholars base their reasoning on firm-level empirical evidence which shows that ‘branding’ affects economic returns (e.g., Hirschey and Weygandt 1985; Barth et al. 1998). Another example is design expenditure. Scholars in the management literature have argued that design is a strategic asset (Kotler and Rath 1984) that affects a company’s performance (Hertenstein et al. 2005). Accordingly, economists are broadening further the set of expenses to be treated as investment in SNAs. A popular ‘new growth’ accounting framework is that proposed by Corrado, Hulten and Sichel (CHS). Yet although the CHS framework has been widely used, it has not received the academic scrutiny that it merits. In particular, the methodology for measuring some intangible investment series has not been validated and sometimes amounts to a “rudimentary guess” (Corrado et al. 2005:28). Some authors have strongly criticised the CHS methodology (Ståhle and Ståhle 2012). This paper proposes a validity test of two of the intangible series in the CHS model: ‘architectural and engineering design’ and ‘brand equity’. The validity test involves assessing the strength of the correlation between the investment series and objective measures of intangible assets. Just as R&D investments are consistently found to be correlated with patent counts, we expect that investment in architectural and engineering design is correlated with the count of registered design rights, and that investment in brand equity is correlated with trademark counts. A significant correlation would provide evidence that the estimated investment series achieve their purpose of capturing intangible assets. The econometric analysis is performed on a panel of 32 countries from 1980 to 2010.

ID: 018 Productivity Illusion: the Hidden Time Distortion as a Source for Productivity Decrease in Economic Organizations
Authors: Fabian von Schéele and Darek Haftor (Linnaeus University) Contact: [email protected] Category: Bootstrap for DEA or SFA
Abstract: Little attention has been paid to the problem of time distortion in productivity, where time distortion is the discrepancy between objective, or physical, time and subjective, or cognitive, time. Cognitive time distortion is a source of various performance challenges in organisations and their operations, including economic inefficiency, unsatisfactory output quality, and reduced human well-being. The output-input ratio of productivity may be understood in terms of these two kinds of time: the input corresponds to physical time, while the output is based on time assessments. The current conventions of economic organizations – both in practice and in theory – assume physical time to be the only kind of time. Based upon findings from the biological and mental sciences, this assumption is challenged. We therefore establish a formal relation between physical time and cognitive time, and then between that temporal relation and the economic performance of an organization. One implication is that a calculus that accounts for only one kind of time, the physical, disregards errors arising from cognitive time assessments by human agents, thereby forming the ground for large errors in performance measurement. An attempt to remedy this is made here by introducing a novel method for the conception and analysis of productivity in which the human agent’s cognitive time distortion is accounted for. The formal elaboration of the productivity and efficiency metrics presented here is based upon a workload equation and the conventional total profit equation; both, however, are formulated to account for empirical findings on the probabilistic and asymmetric character of cognitive time distortion, at the individual as well as the group level. The novelty of the workload equation and the total profit equation comes from their ability to predict true working time, by introducing new mathematical mechanisms such as a lever effect between economy, workload and cognitive time distortion. Consequently, we suggest that the fundaments of productivity analysis need to be modified with regard to the measurement of time and value in workload and profit. The model for productivity analysis introduced here may have significant impact for managers.

ID: 020 An enhanced Malmquist-Luenberger index to assess productivity change in the presence of undesirable outputs
Authors: Andreia Zanella, Ana S. Camanho and Teresa G. Dias Contact: [email protected] , [email protected] , [email protected] Category: Index Numbers - Applications
Abstract: This paper reviews and discusses the different approaches that can be used to accommodate undesirable outputs in the analysis of productivity change over time, namely the ratio-based Malmquist-Luenberger (ML) index and the difference-based Luenberger index. The ML index is derived from a standard output-oriented Malmquist index, using the relationship between the directional distance function and Shephard's output distance function. It is shown that an alternative ML index can be derived from the relationship between the directional distance function and Shephard's input distance function. The two versions of the ML index represent equally good adaptations of the Malmquist index for assessments involving both desirable and undesirable outputs, although they result in different values. In order to avoid the need to choose one of the measures arbitrarily, we propose an enhanced version of the ML index, given by the geometric mean of the two alternative versions. This new index, named the Average ML index, has the advantage of incorporating both orientations in the computation of the productivity change score, and thus represents the changes in DMUs' features more accurately. It is demonstrated that the Malmquist-Luenberger and Luenberger indices produce consistent and strongly correlated results. The main methodological difference between these indices that can motivate the use of one or the other is their multiplicative or additive nature. In cases where there is a preference for ratio-based indices, we believe that the Average ML index proposed in this paper should be used, as it is more precise in estimating productivity change. An empirical example, focusing on the livability of 120 European cities, is used to compare the results obtained by the different versions of the ML index with the results of the Luenberger index. The results suggest that the Average ML index is more closely aligned with the Luenberger index than the other two versions of the ML index.
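Editor's note: in the notation of directional distance functions D(x, y, b; g), the standard output-based ML index between periods t and t+1 is usually written as below, and the Average ML index proposed in the paper is the geometric mean of this index and its input-based counterpart. The formula is a generic sketch, not copied from the paper.

```latex
ML^{t,t+1} =
\left[
\frac{1+\vec{D}^{\,t}(x^t,y^t,b^t;g)}{1+\vec{D}^{\,t}(x^{t+1},y^{t+1},b^{t+1};g)}
\cdot
\frac{1+\vec{D}^{\,t+1}(x^t,y^t,b^t;g)}{1+\vec{D}^{\,t+1}(x^{t+1},y^{t+1},b^{t+1};g)}
\right]^{1/2},
\qquad
\text{Average } ML^{t,t+1} = \sqrt{ML_{O}^{t,t+1}\; ML_{I}^{t,t+1}} .
```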

ID: 021 Educational systems under scrutiny: an application using robust nonparametric frontier methods
Authors: Jose Manuel Cordero Ferrera, Daniel Santin, Gabriela Sicilia and Rosa Simancas Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Health & Education
Abstract: This paper uses a fully nonparametric framework to estimate the relative performance of educational systems in 35 countries using data from students in the fourth grade of primary education. This study is an unusual enterprise since most of the empirical research in the field is restricted to evaluations at the regional or national level and focused on secondary education. However, the recent publication of the 2011 results of the Progress in International Reading Literacy Study (PIRLS) and the Trends in International Mathematics and Science Study (TIMSS) makes it possible to carry out an international comparison along these lines. We adapt the metafrontier framework to compare the technical efficiency of units operating in heterogeneous contexts, in our case represented by different educational systems. Likewise, we use a technique inspired by the non-parametric Free Disposal Hull (FDH) and apply partial order-m models, which allow us to mitigate the influence of atypical and outlying observations as well as the problem of different group sizes for each country. In addition, we adapt the production technology to the presence of both desirable outputs of educational achievement and undesirable outputs of educational inequality as output measures. The results allow us to identify different strategies used by the countries participating in the study to improve their educational performance. In particular, some of them focus on improving their final average test scores, while other countries attempt to reduce the gap between the best and worst performers.

ID: 025 Estimating performance of rail transport by bootstrap data envelopment analysis
Authors: Erwin Lin Contact: [email protected] Category: Bootstrap for DEA or SFA
Abstract: Rail transport has long played an important role in the economic development of a country, and enhancing its operating efficiency is crucial for remaining sustainable in a competitive context. Many researchers have applied data envelopment analysis (DEA) to rail transport performance measurement over the past several decades; however, DEA has been criticized for not taking statistical noise into account and for lacking any hypothesis testing. This study applies the bootstrap DEA (BDEA) method proposed by Simar and Wilson (1998) to estimate the technical efficiency and confidence intervals of 27 rail companies from selected European Union member states in 2010. The empirical results indicate that under the DEA method ten (twelve) railways are evaluated as efficient, and average technical efficiency is 0.493 (0.613) under constant (variable) returns to scale technology. Under BDEA, none of the 27 railways is evaluated as efficient and the corresponding average efficiencies are 0.396 and 0.431, respectively, implying that the DEA method overestimates the efficiency of the railways under study. The results also indicate that the 90% confidence interval of average efficiency estimated by BDEA is (0.357, 0.524), from which one can obtain the outputs that each railway should expand in order to be efficient. Furthermore, the results suggest that the null hypothesis of constant returns to scale cannot be accepted. Finally, based on the results, some strategies for improving the railways' performance and possible avenues of future research are proposed.
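Editor's note: for reference, the bootstrap bias correction behind the BDEA estimates mentioned above is usually written as follows (generic notation, not taken from the paper). Given an original DEA score theta-hat and B bootstrap replicates theta-hat*_b,

```latex
\widehat{\operatorname{bias}} = \frac{1}{B}\sum_{b=1}^{B}\hat\theta^{*}_{b} - \hat\theta,
\qquad
\hat\theta_{bc} = \hat\theta - \widehat{\operatorname{bias}}
               = 2\hat\theta - \frac{1}{B}\sum_{b=1}^{B}\hat\theta^{*}_{b},
```

with percentile-type confidence intervals obtained from the empirical distribution of the bootstrap values.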

ID: 026 Measuring school demand in the presence of spatial dependence. A conditional approach
Authors: Laura López-Torres and Diego Prior Contact: [email protected] , [email protected] Category: ECR Day
Abstract: Improving educational quality is an important public policy goal. However, success requires identifying the factors associated with student achievement. At the core of these proposals is the principle that increased public school quality can make the school system more efficient, resulting in correspondingly stronger performance by students. Nevertheless, the public educational system is not devoid of competition, which arises, among other factors, through management efficiency and the geographical location of schools. Moreover, households in Spain appear to choose schools partly on the basis of location. In this environment, the objective of this paper is to analyze whether geographical space has an impact on the relationship between the level of technical quality of public schools (measured by the efficiency score) and the school demand index. To do this, an empirical application is performed on a sample of 1,695 public schools in the region of Catalonia (Spain). The effects of spatial autocorrelation on the estimation of the parameters, and how these problems are addressed through spatial econometric models, are shown. The results confirm that space plays a moderating role in the relationship between efficiency and school demand. Nonetheless, this impact only occurs in urban municipalities.

ID: 027 The Players’ Efficiency Monitoring Systems for Match-Fixing Events in CPBL
Authors: Win Bei Lin Contact: [email protected] Category: Bootstrap for DEA or SFA
Abstract: Baseball is the most popular sport in Taiwan. However, there have been repeated match-fixing scandals at the top level of the professional baseball league, which have deeply disappointed baseball fans. This study explores the match-fixing scandals in the Chinese Professional Baseball League and the efficiency patterns of the players involved. We used data envelopment analysis to estimate players’ efficiency, and the exponentially weighted moving average to verify whether there was a change in players’ efficiency values over the years. A single week was used as a time point, giving a total of 342 time points for the infielders involved in three match-fixing scandals and a total of 161 time points for the pitchers involved in two match-fixing scandals. According to the detection pattern for the efficiency anomalies of the players involved in match-fixing scandals, the detection accuracy rate for the three match-fixing scandals was 74%, and the overall detection accuracy for the match-fixing scandals in 1996 was 72%. Moreover, the detection accuracy rate for efficiency anomalies of the infielders involved in match-fixing scandals in 2008 reached 100%. No efficiency anomalies were detected for the infielders in 2005, and none were detected for the pitchers in 2008. An assessment system was established to continuously evaluate player performance and monitor efficiency anomalies, with the aim of preventing future match-fixing scandals.

ID: 028 Measuring Bank Branch Growth Efficiency and Growth Trends with a new DEA Model structure and Malmquist-like Techniques
Authors: Alex LePlante Contact: [email protected] Category: DEA - Applications to Banking
Abstract: The banking industry plays an integral role in today’s society, offering a wide range of essential products and services that our communities and businesses rely on. To remain competitive amongst their peers, banks must continually grow their business and attract new clients; a task that has become increasingly difficult with current market volatility and the continuous entry of new competitors. To aid in this effort, banks have turned to performance analysis to help identify underperforming areas, cut costs and improve efficiency. Amongst the frontier efficiency analyses identified in the literature, Data Envelopment Analysis (DEA) was found to be one of the leading approaches. DEA has been successfully applied in many bank branch performance evaluations using traditional intermediation, profitability and production approaches. However, there has been little focus on assessing the growth potential of individual branches. This study proposes a new DEA model structure to assess the growth realized by a branch between two specified points in time. This is achieved by incorporating resource and product data from two periods into one model. The results of the model are verified to ensure functionality and accuracy. The analysis is then extended using six consecutive years of data, producing five distinct growth frontiers. A Malmquist-like means of frontier comparison is then developed to quantify the frontier shift and DMU catch-up observed from one period to the next. These newly developed methodologies are applied to an extensive panel data set consisting of over 1,000 DMUs provided by one of Canada’s top five banks.

ID: 029 Ratio Analysis for Performance Evaluation: The Case of Vietnamese Banks
Authors: Dang Thanh Ngo, David Tripe and Claire Matthews Contact: [email protected] Category: Other Topics
Abstract: This paper examines the current situation and performance of 12 Vietnamese banks for the period 2003-2010 using ratio analysis, a traditional but important method in banking and finance performance evaluation. Ratios are selected following the six categories of the CAMELS rating system, namely Capital adequacy, Asset quality, Management quality, Earnings ability, Liquidity, and Sensitivity to market risks. Our findings suggest that during the 2003-2010 period the Vietnamese banking sector developed soundly, with an increasing capital base and improving management quality and liquidity, while bad loans and mismatch gaps decreased. However, overall performance was on a decreasing trend, with the State-owned banks the worst performers. After the global financial crisis of 2007-2008, the system started to show some fragility (bad loans started to rise, the liquidity of State-owned banks fell while their mismatch remained high, etc.), suggesting that more monitoring and provisioning are needed. In addition, the equitization (or privatization) of the State-owned banks should be sped up, and asset size control should be taken into account, in order to improve the performance of Vietnamese banks.

ID: 030 Decomposing Productivity Growth with Unbalanced Panel Data and DEA: Application to the OECD banking systems
Authors: Dang Thanh Ngo, David Tripe and Claire Matthews Contact: [email protected] Category: DEA - Applications to Banking
Abstract: A popular approach for measuring productivity change through time is the Malmquist Index (MI) combined with Data Envelopment Analysis (DEA). The MI-DEA approach, however, faces difficulties with unbalanced panel data. In this situation, researchers normally either omit the missing observations or impute artificial data to create a balanced panel, so the results can be biased. The current paper seeks an alternative way to deal with unbalanced panel data using a Fisher TFP index (FI). Although the price information required to calculate the FI measures is normally unknown, shadow prices for the input and output variables can be estimated via DEA. Once the (shadow) prices are measured, the FI-DEA index can be calculated and decomposed into technical, scale, and allocative efficiency changes. An empirical application to the productivity changes of the banking systems in 20 OECD countries (1979-2009) shows that the FI-DEA approach (i) is simple to calculate from the data without any information bias; (ii) satisfies both the axiomatic and economic approaches; (iii) is reliable and consistent with the MI-DEA approach; and (iv) retains the power of decomposition.

ID: 032 A meta-regression analysis of frontier efficiency estimates from Africa
Authors: Kolawole Ogundari Contact: [email protected] Category: DEA - Applications to Agriculture & Fisheries
Abstract: The paper attempts to shed light on whether efficiency estimates of African agriculture have improved or declined, and on what drives efficiency levels, over three decades (1984-2013) of research on the productivity of African agriculture. We employ meta-regression analysis (MRA) on a total of 442 studies yielding 612 farm-level efficiency estimates, given that some studies reported more than one estimate. The studies cut across all regions of the continent, with 30 countries represented. Taken together, an overall mean farm-level efficiency estimate of about 0.68 was obtained from the case studies, which indicates that there is scope to improve the efficiency of African agriculture: about 32% of costs could be saved if agricultural production were on the frontier in the region. The MRA results show that mean efficiency estimates of African agriculture from the primary studies decrease significantly as the year of survey increases. This implies that, over the years, negative efficiency change has characterized the development of African agriculture and food production. The results for the other study attributes considered in the MRA show that studies published in journals, using parametric methods and using panel data report significantly higher mean efficiency estimates, while studies focusing on grain crops and with higher degrees of freedom yield significantly lower efficiency estimates. The variation in reported efficiency estimates is significantly lower among countries from the Eastern and Central Africa regions compared to studies from the Western Africa region, taken as the reference. Other results identify the key drivers of efficiency in African agriculture and food production over the years as education, years of experience, extension, credit, farm size and membership of a cooperative society.

ID: 034 Technical efficiency of Russian plastic and rubber production firms
Authors: Anatoly Peresetsky and Irina Ipatova Contact: [email protected] , [email protected] Category: SFA - Other Applications
Abstract: In this paper the technical efficiency of Russian plastic and rubber production firms in 2006-2010 is estimated by SFA. It is demonstrated that an increase in firm size increases efficiency, and that there are increasing returns to scale in the sector. This result is robust across various specifications of the production function and SFA models. Cross-section and panel data models with heteroscedasticity are estimated for Cobb-Douglas and translog specifications of the production function. A formula for the marginal effects on technical efficiency is derived for the case of an exponentially distributed inefficiency term. Autocorrelation of the efficiency estimates is considered as a measure of their persistence.

ID: 035 Flexible Productivity Analysis
Authors: Gholamreza Hajargasht Contact: [email protected] Category: SFA - Semi & Non-parametric Approaches
Abstract: Since the seminal work of Aigner, Lovell and Schmidt (1977), a wide range of stochastic frontier models have been proposed to incorporate various features such as environmental variables, heterogeneity and temporal variation in efficiencies. In recent years there have also been attempts, mainly applying local nonparametric methods such as kernel estimators, to estimate SF models while avoiding some of their restrictive parametric assumptions. The purpose of this paper is to introduce what I call flexible productivity analysis (FPA hereafter). This approach relies on low-rank penalized splines to model the nonparametric elements of the model. We argue for, and try to demonstrate, the following benefits of this approach: (1) it allows us to incorporate nonparametric elements into almost any existing SF model in the literature without making its estimation substantially more difficult; (2) the researcher can specify any part of the model as flexibly as he or she wants, or as the data allow, under a unified framework; one can choose a fully parametric part, a fully nonparametric part, or anything in between, and estimate the model within the same framework; (3) we focus on additive models with interactions for the technology part of the model, which provide a good compromise between flexibility and tractability; and (4) it is also possible to impose restrictions such as monotonicity on the nonparametric elements of the model without much difficulty. A number of simulated and empirical examples are provided to demonstrate these features.

ID: 038 Nonparametric Least Squares Methods for Stochastic Frontier Models
Authors: Valentin Zelenyuk, Leopold Simar and Ingrid van Keilegom Contact: [email protected] Category: SFA - Theory
Abstract: When analyzing the productivity and efficiency of firms, stochastic frontier models are very attractive because they allow, as in typical regression models, for some noise in the data generating process. Most approaches so far have used very restrictive, fully parametric models, both for the frontier function and for the components of the stochastic terms. Recently, local MLE approaches were introduced to relax these parametric hypotheses. However, the high computational complexity of the latter makes them difficult to use, in particular if bootstrap-based inference is needed. In this work we show that most of the benefits of the local MLE approach can be obtained with fewer assumptions and with much easier, faster and numerically more robust computations, by using nonparametric (or local polynomial) least-squares methods. Our approach can also be viewed as a semi-parametric generalization of the so-called "modified OLS" that was introduced in the parametric setup. Our method is illustrated and compared with other methods on simulated data and on a real data set. We also discuss the vital issue of explaining variation in inefficiency levels with respect to explanatory variables under minimal assumptions on the data generating process.
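Editor's note: for context, the parametric "modified OLS" referred to above estimates the frontier in two steps (a generic sketch, not the authors' semi-parametric version): OLS on the average production function, followed by an upward shift of the intercept by the estimated mean of the inefficiency term,

```latex
y_i = \beta_0 + x_i'\beta + \varepsilon_i,\quad \varepsilon_i = v_i - u_i,
\qquad
\hat\beta_0^{\,MOLS} = \hat\beta_0^{\,OLS} + \hat{E}[u],
```

where E[u] is recovered from higher-order moments of the OLS residuals under a distributional assumption on u (e.g. half-normal or exponential).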

ID: 039 Be happy, be productive: the relationship between productivity and well-being in Western Europe
Authors: Charles-Henri DiMaria, Chiara Peroni and Francesco Sarracino Contact: [email protected] Category: DEA - Other Applications
Abstract: This article sets up a test procedure to uncover the role of subjective well-being, as measured by life satisfaction, in production (as an input or an output) at the aggregate level. The methodology relies on data envelopment analysis and the measurement of total factor productivity and efficiency. For a sample of European countries, the results show that subjective well-being is an input and not an output.

ID: 041 Production efficiency and capacity utilisation of the Moreton Bay Sydney rock oyster industry
Authors: Peggy Schrobback, Sean Pascoe and Louisa Coglan Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Agriculture & Fisheries
Abstract: The Sydney rock oyster (SRO) industry is located on Australia's east coast and is one of the country's oldest farming industries. The industry has been affected by a range of challenges over the past decades, including recurring disease outbreaks; the management of food security, biodiversity and environmental degradation risks; severe weather events; and increased market competition from Australia's Pacific oyster industry. SRO production appears to be particularly challenged in Queensland's Moreton Bay, the northernmost cultivation area of the industry. Today, it is unclear whether this situation is due to oyster farmers' business choices and personal traits, or whether environmental conditions in Moreton Bay limit the economic capacity of the oyster industry in this region. In this study we assess the economic capacity and capacity utilisation of the Queensland SRO industry using cross-sectional time-series data. We use the conventional production input factors, labour and capital, in a multi-output oriented data envelopment analysis (DEA). In a second-stage analysis we estimate the influence of oyster lease owners' demographic traits and of environmental conditions at production sites on the derived efficiency and capacity scores. The results provide information about different efficiency measures and an indication of the potential industry output that could be produced in Moreton Bay if production capacity were fully utilised under given production conditions. This knowledge will form the basis for a discussion about the optimal allocation of territorial access rights in Queensland's SRO industry.

ID: 043 Private versus State-owned Banks: A Comparison of Technical Efficiency in the Vietnamese Banking Sector
Authors: Thanh Phuong Le, Charles Harvie, Amir Arjomandi and Zhiming Cheng Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: ECR Day
Abstract: This paper uses bootstrapped Data Envelopment Analysis (DEA) models to compare the technical efficiency of private and state-owned banks in Vietnam after the country's accession to the WTO in 2007 (2007-2012). It is found that state-owned banks are more efficient in terms of aggregate technical efficiency than private banks under both the intermediation and operating approaches. Using the Li (1996) test, we find that the primary cause of this is the greater ability of state-owned banks to reach the production frontier. This may be due to the fact that deficiencies in the institutional environment have undermined the sustainable and consistent development of private banks.

ID: 044 Firm dynamics and productivity growth: a case study of Australian Manufacturing and Business Services
Authors: David Hansell and Thai Nguyen Contact: [email protected] , [email protected] Category: Growth Accounting
Abstract: Competitive markets foster the reallocation of inputs, with resources channelled from less competitive to more competitive firms, and hence increase aggregate productivity. The turnover of firms entering and exiting industries is part of this competitive process, as entering firms vie for market share and exiting firms cease consuming inputs. There is a large body of theoretical and empirical work on firm dynamics, yet to date very few large-scale studies have been conducted in Australia due to limited access to firm-level data. This study uses a large panel of businesses, drawn from administrative data provided to the Australian Bureau of Statistics (ABS), which allows us to track firms over the nine years from 2002-03 to 2010-11. Using this comprehensive panel we examine the productivity of firms in Manufacturing and Business Services and, in particular, measure the contribution of entry and exit to aggregate productivity growth. We find that exiting firms not only have low productivity in the year prior to exit, but low productivity relative to established firms up to eight years prior. Entrants grow most rapidly in their second year of operation, but after five years are still ten per cent below the level of established firms. At the division level, the main driver of productivity growth is continuing firms, and the net impact of firm turnover is modest. However, among the studied industries, net entry can be significant - a fact masked by the higher level of aggregation. Over the nine-year period, entry lowered aggregate productivity growth by 13 per cent in Manufacturing and 23 per cent in Business Services, as entrants were less productive than continuing firms. In contrast, the exit of less productive firms raised productivity by 12 per cent in Manufacturing and 23 per cent in Business Services.

ID: 045 Use of a prototype linked employer-employee dataset for productivity analysis
Authors: Joseph (Chien-Hung) Chien and Andreas Mayer Contact: [email protected] Category: Other Topics
Abstract: This study uses a prototype linked employer-employee dataset (LEED) to analyse both employee and firm characteristics in order to identify the factors that explain differences in firm productivity. We created the micro data by linking de-identified Personal Income Tax data from the Australian Taxation Office with the ABS Business Longitudinal Dataset (2009-10 and 2010-11). The aim is to demonstrate the analytical potential of the prototype LEED, which captures employer and employee dynamics, for conducting longitudinal micro-level productivity analysis across different sectors of the economy. Ultimately there is the potential to examine how including the business cycle and other macro-economic variables would improve the model.

ID: 047 Impact of carbon emissions on airline productivity: An analysis of airline operational performance under joint production of good and bad outputs
Authors: Boon L. Lee, Clevo Wilson and Carl A. Pasurka Jr. Contact: [email protected] , [email protected] , [email protected] Category: DEA - Other Applications
Abstract: This study employs a two-stage analysis to measure and quantify the sources of productivity and efficiency of airlines under joint production of good and bad outputs. In the first stage, productivity is measured using both Malmquist and Malmquist-Luenberger total factor productivity (TFP) indices. The results show that pollution abatement activities under the Malmquist-Luenberger approach lower productivity growth, which suggests that the Malmquist approach, which ignores CO2 emissions, overstates "true" productivity growth. The results also show that productivity growth is higher amongst low-cost carriers than mainstream airlines, largely attributable to rising competition via improvements in best-practice management and operations. The reliability of the TFP scores is tested and verified using confidence intervals based on Simar and Wilson's (1999) bootstrapping approach, which provides statistical inference. The second-stage analysis focuses on determining the sources of (in)efficiency using the generalised method of moments.

ID: 049 Australian university productivity growth and shadow prices for university teaching and research Authors: Roger Carrington, Prasada Rao and Chris O'Donnell (Centre for Efficiency and Productivity Analysis, The University of Queensland, St Lucia, QLD 4072, Australia) Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Health & Education Abstract: Productivity growth estimates for Australian universities are sensitive to the choice of productivity index but not to output measures for teaching. Data envelopment methods are used to decompose the Malmquist and Färe-Primont indexes. These indexes suggest that annual productivity growth for universities over 2005 to 2010 was negative 0.05 per cent and 1.3 per cent, respectively. Improvements in scale and mix efficiency were an important source of productivity growth for universities over recent years, and this explains the discrepancy in the productivity growth estimates produced by the two indexes: the Färe-Primont index can decompose productivity growth into scale and mix efficiency but the Malmquist index cannot. Restricted price information for undergraduate teaching and research limits public judgments on the relative importance of these university outputs. To fill this information gap, shadow prices for teaching and research are calculated using an input distance function, which is estimated by stochastic frontier analysis. The shadow prices suggest that the larger research-intensive universities have a comparative advantage in teaching science students and undertaking research. Key words: higher education, productivity growth, shadow prices JEL classifications: I21, I23

ID: 050 Performance Analysis of Primary Schools in Australia: A Case Study of Brisbane Schools Authors: Boon L. Lee, Vincent Hoang and Clevo Wilson Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Health & Education Abstract: Since 2008-09, substantial government funding has been allocated to reforming Australia's education system. During the last four years more than $540 million has been invested in the Smarter Schools National Partnership for Literacy and Numeracy with a view to facilitating and rewarding the implementation of evidence-based strategies that improve student literacy and numeracy skills. Furthermore, $550 million was invested in the Smarter Schools National Partnership for Improving Teacher Quality over five years to support states and territories in improving the quality of the Australian teaching workforce. Although these projects are recent initiatives, it is still important to assess the impact such funding has had on school performance in the early stages of the projects. To this end, we study the efficiency of a sample of 175 participating primary schools (private and public) within a 20 kilometre radius of the Brisbane City Council for the period 2008-2011 using (1) data envelopment analysis (DEA) followed by Simar and Wilson's (2007) bootstrap truncated regression and (2) fixed-effects stochastic frontier analysis. Both approaches are widely recognised for generating results that are more reliable than earlier methods. Recent DEA studies that have employed Simar and Wilson's approach include Alfonso and Aubyn 2006; Barros and Assaf 2009; Alexander et al. 2010; and Barros and Garcia-del-Barrio 2011. In the first stage, a bootstrapped DEA model is employed to estimate the technical efficiency of the 175 schools. In the second stage, the bootstrap DEA scores are regressed against a set of performance-based variables using a truncated regression estimated by maximum likelihood. Determining how these explanatory variables impact on efficiency estimates helps to shed light on the sources of inefficiency. In the stochastic frontier analysis, the frontier and the determinants of variations in efficiency performance are estimated simultaneously; such models have been developed and used widely in empirical studies in the literature (Battese and Coelli 1995; Greene 2005).
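
As a purely illustrative aid (not the authors' code or data), the first-stage calculation described above amounts to solving one small linear program per school. A minimal input-oriented, variable-returns-to-scale DEA sketch in Python follows; the toy data and the use of scipy.optimize.linprog are assumptions made for the example, and the Simar and Wilson (2007) bootstrap and second-stage truncated regression would be layered on top of these scores.

import numpy as np
from scipy.optimize import linprog

def dea_input_vrs(X, Y):
    """Input-oriented VRS (BCC) efficiency scores.
    X: (n, k) inputs, Y: (n, m) outputs; returns theta in (0, 1] per school."""
    n, k = X.shape
    m = Y.shape[1]
    theta = np.empty(n)
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]                      # minimise theta
        # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o][:, None], X.T])
        # outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((m, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(k), -Y[o]]
        # VRS convexity constraint: sum_j lambda_j = 1
        A_eq = np.r_[0.0, np.ones(n)][None, :]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        theta[o] = res.x[0]
    return theta

# toy data: 5 schools, 2 inputs (staff, funding), 1 output (test score)
X = np.array([[20, 1.0], [25, 1.4], [18, 0.9], [30, 2.0], [22, 1.1]])
Y = np.array([[480.], [500.], [470.], [510.], [495.]])
print(dea_input_vrs(X, Y).round(3))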

ID: 055 Complementarities Matter: A Search for Patterns of Effective IT-use at the Individual Level Authors: Natallia Pashkevich, Darek M. Haftor and Fabian Von Sheele Contact: [email protected] Category: ECR Day Abstract: This work presents research in progress in terms of a conceptual model for understanding IT-driven productivity at the individual level. The existence of the IT productivity paradox at different economic levels has long been a concern for IS researchers. Since evidence demonstrates that IT does, in fact, increase productivity at the macro-, meso- and micro-levels, research attention has shifted to the individual and task level of IT-driven productivity. In this regard, recent studies have applied different theoretical foundations to investigate the relationship between IT-use and performance effects at the individual level in post-adoption contexts. Over the last decade, the idea that a set of organizational factors needs to change in a synchronized fashion when new IT is introduced has received particular attention from researchers (e.g. Bartel et al., 2007; Tambe et al., 2012; Aral et al., 2012; Brynjolfsson & Milgrom, 2012). These studies have addressed the need for the joint adaptation of IT-use and innovative human resource management practices to enhance productivity gains at the firm, establishment and business-unit levels. To investigate these proposals, we have designed a new research model aimed at studying productivity growth when IT is deployed, jointly and in a synchronized manner, with both individual-capital and organizational-capital factors at the individual level. The aim is to advance our understanding of the underlying mechanisms of information worker productivity at the individual level. An understanding of the patterns of effective IT-use together with other factors may help determine where research and managerial efforts have to be concentrated in order to enhance the individual productivity of information workers. Key words: IT productivity paradox, complementarity, patterns, IT-driven productivity, individual level

ID: 056 The Many Decompositions of Productivity Change (2014 version) Authors: Bert M. Balk Contact: [email protected] Category: Index Numbers - Theory Abstract: Ten years ago, at the first Asia-Pacific Productivity Conference (APPC) held in Australia, I presented the first version of this paper. For a number of reasons the paper has never been updated, nor did I make any attempt at publication. Thus it seems appropriate, on the occasion of the second APPC in Australia, to incorporate the developments of the last ten years in a new version of the paper, which then provides a survey of the current situation. Productivity change between a base period and a comparison period must be measured by a function that exhibits proportionality in input and output quantities. In the absence of prices, natural candidates are the Malmquist indexes conditional on the base or comparison period cone technology, or a geometric average thereof. These indexes have the property of transitivity, which is the reason why they can be decomposed into independent components. The decompositions can be obtained in a systematic way by considering the various paths in input and output quantity space that connect the base period data point with the comparison period data point. It turns out that each productivity index can be decomposed as a product of five independent factors, relating to efficiency change, technological change, the radial scale effect, the input mix effect, and the output mix effect. These decompositions are, however, not unique. The Malmquist index conditional on the base period cone technology admits six different decompositions, as does the index conditional on the comparison period cone technology. The geometric average index even admits eighteen different decompositions. By merging either the input mix effect or the output mix effect with the radial scale effect, one obtains two different decompositions which are symmetric in all variables. The middle part of the paper discusses the Moorsteen-Bjurek productivity index, defined as a ratio of Malmquist output and input quantity indexes. Although this index exhibits the proportionality property, it turns out that, due to its intransitivity, decomposition into meaningful components is impossible. The final part of the paper considers price- and share-weighted productivity indexes. This is the natural place to discuss a number of O'Donnell's contributions.
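
As background for the decompositions discussed above (standard definitions, not a reproduction of the paper's notation), the output-oriented Malmquist indexes conditional on the base-period and comparison-period cone (constant-returns-to-scale) technologies, and their geometric average, are

\[
M^{0} = \frac{\check{D}_o^{0}(x^{1},y^{1})}{\check{D}_o^{0}(x^{0},y^{0})}, \qquad
M^{1} = \frac{\check{D}_o^{1}(x^{1},y^{1})}{\check{D}_o^{1}(x^{0},y^{0})}, \qquad
M^{G} = \left( M^{0} M^{1} \right)^{1/2},
\]

where $\check{D}_o^{t}$ denotes the output distance function defined on the cone hull of the period-t technology; holding the benchmark technology fixed is what gives these indexes the transitivity that the decompositions exploit.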

ID: 057 Modeling Dynamic Rational Inefficiency Authors: Monica Roa Contact: [email protected] Category: Other Topics Abstract: The literature on efficiency focuses considerable attention on defining and measuring efficiency levels, with relatively less emphasis on the sources driving inefficiency. In line with Bogetoft and Hougaard (JPA 2003), we propose a model in which inefficiency is a rational decision: the firm moves below the frontier because the gains from inefficiency exceed the costs of removing it. We present a model in which the firm faces an imperfect factor market and, as a result, the DMU has to reorganize its input requirement vector; this behavior leads the firm to an apparent waste of resources. In fact, this potential improvement reflects a rational reorganization decision among the units to adapt the firm to the least costly efficiency path in the long run. The focus in this paper is on the role of deposits in financial institutions. Deposits play a dual role in banking; they can be inputs for the production of bank loans, or outputs (safekeeping and checking services) provided to the depositors. In the former case, banks pay interest on deposits and in the latter depositors pay for the services. The net difference can be positive or negative depending on banking market power, depositors' characteristics and factor market structure. The input market is characterized as monopsonistic competition, while the output market is viewed as Bertrand competition. The case study is the Colombian financial system from 1995 to 2012, including banks, financial corporations and leasing companies. The data set contains intermediation and non-intermediation services, employees by unit and location, the number of offices in each city, and the borrowing and lending rates by financial company.

ID: 059 A New Look at the Decomposition of U.S. Agricultural Productivity Growth: Are Climatic Effects Important? Authors: Eric Njuki, Boris Bravo-Ureta and Christopher J. O'Donnell Contact: [email protected] , [email protected] , [email protected] Category: ECR Day Abstract: Climatic factors are important for agricultural output, but this issue has not been addressed explicitly in the econometric analysis of total factor productivity (TFP) growth. This article addresses this gap in the literature and makes two contributions: 1) it utilizes a TFP index that satisfies key axiomatic and economic-theoretic requirements for constructing index numbers; and 2) it uses this index to evaluate TFP change in U.S. agriculture while incorporating climatic effects. The argument is that, for any given input levels, output levels as well as TFP will be affected by changes in climatic variables. Therefore, the contribution of this article is to isolate these climatic effects on TFP by decomposing productivity into climatic effects, technological progress, technical efficiency, and scale efficiency changes. The TFP index employed in this article is a special case of an index that corresponds to a Cobb-Douglas output distance function first proposed by O'Donnell (2012b). In the absence of technological change and climatic or environmental effects, the output and input change components are indexes similar to those found in Färe and Primont (1995). Hence, we refer to this index as the Färe-Primont-O'Donnell index. This index satisfies the basic economic and axiomatic properties, and is multiplicatively complete. Using state-level data for the period 1960 to 2004 from the Economic Research Service of the U.S. Department of Agriculture, we address the following research questions: 1) What are the key drivers of productivity growth in the face of climatic effects? 2) What has been the impact of climatic variables on TFP growth in U.S. agriculture? 3) To what extent have irrigation and expenditures on research and development counteracted the adverse effects of climatic variability? Preliminary results indicate that when climatic variables are factored in, TFP growth has averaged 0.95% annually, which is considerably lower than what has been reported in previous studies. Furthermore, technological progress is the key factor driving TFP growth in U.S. agriculture. Temperature and irrigation have a significantly positive effect on agricultural output, whereas changes in precipitation have a negative but statistically insignificant effect. The climatic effect component, which combines temperature and precipitation, contributed positively to TFP growth in 10 southern states, and negatively in the remaining 38 contiguous states.
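
For context, a multiplicatively complete TFP index in the O'Donnell sense compares an aggregate output with an aggregate input, and a Cobb-Douglas output distance function yields a geometric-mean output aggregator (a generic sketch of the construction, not the article's exact specification):

\[
TFP_{it} = \frac{Q_{it}}{X_{it}}, \qquad
TFP_{hs,it} = \frac{TFP_{it}}{TFP_{hs}} = \frac{Q_{it}/Q_{hs}}{X_{it}/X_{hs}}, \qquad
Q_{it} = \prod_{m} y_{mit}^{\gamma_m} \ \text{ with } \ \sum_{m}\gamma_m = 1,
\]

and the decomposition described above multiplies out climatic-effect, technical-change, technical-efficiency and scale-efficiency components of this ratio.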

ID: 060 Technical Efficiency and Unobserved Heterogeneity for a Cross Section of Wine Grape Producers: A Final Look Authors: Boris Bravo-Ureta, Víctor Moreira, Javier Troncoso and Alan Wall Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: SFA - Applications to Agriculture & Fisheries Abstract: Studies focusing on technical efficiency (TE) in wine grape production are limited despite the growing interest in this farming sector and the popularity of efficiency analysis in agriculture. We extend earlier work (Bravo-Ureta et al. 2012) by expanding the range of models utilized to incorporate agro-ecological conditions (warm and cool), an important consideration in grape production. We estimate stochastic production frontiers (SPF) at the block level accounting for farm-level heterogeneity, which is assumed to reflect overall farm management. We utilize the following models: 1) Ordinary Least Squares; 2) Fixed Effects; 3) traditional SPF; 4) Random Effects; 5) True Fixed Effects; 6) True Random Effects (TRE); and 7) TRE-Mundlak Transformation. The data come from a sample of 38 producers for 2005-06. Each farm in the sample has several blocks and each block has specific management. The total number of blocks/observations is 263. The grapes are classified according to quality as Premium and Varietal. All farms are located in Central Chile and distributed, from north to south, across the following 10 valleys: Limarí, Aconcagua, Casablanca, San Antonio, Maipo, Cachapoal, Rapel, Colchagua, Curicó and Maule. The dependent variable is grape production (kg). The explanatory variables are LABOR (expenditures); CAPITAL (expenses on machinery); PHYTO (expenditures on fertilizers and pesticides); and a series of dummy variables taking the value 1 for vines older than 5 years, red grapes, premium grapes, single cordon, double cordon, pergola, cool agro-climatic zone, and valleys. All monetary variables are expressed in US$. Results show that when individual blocks within a farm are treated as independent observations, the average TE scores are lower than those estimated when it is recognized that blocks belonging to a particular farm are subject to an overall central (farm-level) management. This points to the importance of accounting for unobserved heterogeneity when repeated observations from individual farms are available in a cross-section. Regional location was also found to matter for grape production. Moreover, agro-climatic conditions were found to influence production levels: grape farms located in cooler zones produce significantly less than their counterparts in warmer zones. Overall, our estimations show the importance of accounting for both farm heterogeneity and agro-climatic conditions when analyzing TE in grape production.

ID: 061 Effect of the access to ICT on outputs in Chinese agriculture: An empirical analysis using panel data Authors: Shamsul Arifeen Khan Mamun and Mohammad Mafizur Rahman Contact: [email protected] Category: Production Theory Abstract: Research into the effect of access to information and communication technology (ICT) on manufacturing and service sector firms is voluminous, but the effects of ICT on agricultural firms are under-researched in both developed and developing countries. The current paper investigates the effect of access to the telephone on agricultural productivity in the People's Republic of China. The motivation behind this study is that in the 1960s agricultural output increased dramatically worldwide due to the 'Green Revolution', which was characterised by the increasing use of conventional agricultural inputs. It is anticipated that a new revolution based on information and communication technologies (ICTs) might take place in agriculture worldwide in the near future. In this regard, the concept of precision agriculture technology has already emerged in the developed countries. However, the developing countries lack preparation in this area, which might be due to the lack of a clear understanding of the potential benefits of the use of ICT in agriculture. A disaggregated, agriculture-specific study might be useful to reveal the full potential of the concept of precision agriculture. A dynamic Cobb-Douglas production function and panel data on 29 provinces covering the period 1994–2000 are used in this study. The paper finds that the effect of telephone access on agricultural productivity is positive and statistically significant. The estimated short-run output elasticity of access to ICT is 0.086. Our estimated elasticity in agriculture is greater than the output elasticity reported for non-agricultural and IT-related service firms. The results are robust to additional control variables. Thus, the paper recommends that precision agriculture technology and agriculture-friendly ICT policy have the potential to increase agricultural productivity in developing countries.
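
As an illustration only (the notation and control variables are assumptions, not the authors' exact model), a dynamic Cobb-Douglas specification of the kind described above, for province i in year t, can be written as

\[
\ln y_{it} = \alpha \ln y_{i,t-1} + \beta \ln TEL_{it} + \sum_{k} \delta_k \ln x_{kit} + \mu_i + \varepsilon_{it},
\]

where $TEL_{it}$ measures telephone access, the $x_{kit}$ are conventional agricultural inputs and $\mu_i$ is a province effect; the reported short-run elasticity of 0.086 then says that, holding the other inputs fixed, a 1 per cent increase in telephone access is associated with roughly a 0.086 per cent increase in agricultural output.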

ID: 062 Exploring the technical efficiency of dairy farms in Victoria Authors: Bethany Burke, Dr Emayenesh Seyoum-Tegegn and Dr Francis Karanja Contact: [email protected] , [email protected] , [email protected] Category: SFA - Applications to Agriculture & Fisheries Abstract: This study explores the importance of technical efficiency in productivity growth for dairy farms in Victoria. We aim to build on the results of stochastic frontier analysis (SFA) by highlighting improvements in technical efficiency as a means of boosting sluggish productivity growth. SFA is used for identifying the best performing farms, and thereby the gap between these best performers and inefficient farms, and then for explaining the reasons for the gap. This is of particular use to policy makers in light of the Victorian government's aim of doubling food and fibre production by 2030. We use the Victorian Department of Environment and Primary Industries (DEPI) Dairy Industry Farm Monitor Project (DIFMP) data, consisting of seven years of unbalanced panel data (from 2006–07 to 2012–13) on 145 dairy farms (491 observations in total). Technical inefficiency effects were found to be highly significant in influencing milk production. The technical inefficiency effects were found to be a linear function of different farm-specific factors. The mean estimated technical efficiency was 0.88, ranging between 0.42 and 0.98, which indicates that some dairy farmers have high levels of technical inefficiency in their production processes. This suggests that there may be significant potential for these farmers to increase productivity. Keywords: dairy, stochastic frontier, technical efficiency
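
For reference, a standard inefficiency-effects specification of the Battese and Coelli (1995) type is consistent with the description above, although it is not necessarily the exact model estimated here:

\[
\ln y_{it} = \mathbf{x}_{it}'\boldsymbol{\beta} + v_{it} - u_{it}, \qquad
v_{it} \sim N(0, \sigma_v^2), \qquad
u_{it} \sim N^{+}\!\big(\mathbf{z}_{it}'\boldsymbol{\delta}, \sigma_u^2\big),
\]

so that the farm-specific factors $z_{it}$ shift the mean of the one-sided inefficiency term and technical efficiency is $TE_{it} = \exp(-u_{it})$.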

ID: 063 Technologies, Markets and Behaviour: Some Implications for Estimating Efficiency and Productivity Change Authors: Chris O'Donnell Contact: [email protected] Category: Other Topics Abstract: Most productivity indexes can be exhaustively decomposed into measures of technical change and efficiency change. Estimating these components usually involves the use of data envelopment analysis (DEA) or stochastic frontier analysis (SFA) models. This paper shows how assumptions concerning technologies, markets and firm behaviour can be used to frame these models. The paper explains that the assumptions underpinning common DEA models are rarely, if ever, true. On the other hand, the assumptions underpinning basic SFA models are almost always true. The parameters of basic SFA models can be estimated using ordinary least squares and two-stage least squares methods. More complex SFA models can be estimated using maximum likelihood methods. Unfortunately, the assumptions underpinning some of these more complex models are generally not true. This has important implications for estimating the drivers of productivity change. To illustrate, the paper uses common least squares and maximum likelihood methods to estimate the drivers of productivity change in U.S. agriculture. As expected, the different estimators lead to qualitatively different estimates of the efficiency change components of productivity change.

ID: 065 Agricultural Productivity, Efficiency and Growth in a Semi-Arid Country: Case Study of Botswana, 1979-2011 Authors: Mr Omphile Temoso, Dr David Hadley and Renato Villano Contact: [email protected] , [email protected] , [email protected] Category: Index Numbers - Applications Abstract: To improve the welfare of the rural poor and to decrease the dependence of the national economy on minerals, the government of Botswana has been spending 40% of the value of agricultural GDP on agricultural support services. Despite this massive investment, there is evidence that agricultural productivity has declined in recent years. This paper explores the sources of this decline in agricultural productivity. We use secondary data from six regions of Botswana (covering the period 1979 to 2011) to construct a Färe-Primont index of productivity following the approach proposed by O'Donnell (2012). The results of this analysis will provide policy makers with information to evaluate whether the payoffs from improving the rate of technical progress (e.g., through increased R&D expenditure) are more likely to outweigh the benefits from improving either levels of technical efficiency or scale and mix efficiency. KEYWORDS: Data Envelopment Analysis, Färe-Primont index, Total Factor Productivity, Technical Change, Efficiency Change, Scale Efficiency, Mix Efficiency.

ID: 066 Analyze productivity and risk of Taiwan banks with meta frontier environment productivity index Authors: Yung-Lieh Yang, Zhen-ming Chen, Zi-jun Sheng and Ju-long Su Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: DEA - Applications to Banking Abstract: The banking industry has become more than a funds intermediary in the economic system following two rounds of financial innovation. Banking operations require multiple investments, and these investments carry risk. Recent research on the efficiency of the banking industry has started to focus on banking operations under risk, and most of these studies agree that estimates of banking efficiency are easily biased if the risk effect is left aside. Oh (2010), combining the meta frontier with environmentally sensitive productivity measurement, proposed the meta frontier Malmquist-Luenberger productivity growth index (MML), which can simultaneously estimate how a firm's characteristics and risk affect efficiency and productivity. In this project, we compare banks that merged during the period of financial innovation with banks that did not. We employ the MML model of Oh (2010) to estimate differences in the catching-up effect, the innovation effect and the technological leadership effect between these two kinds of banks, and offer recommendations based on the results.

ID: 067 Effects of Chinese Enterprise Reforms on Efficiency and Productivity: Revisited Authors: Ying Chu Ng and Suthathip Yaisawarng Contact: [email protected] , [email protected] Category: DEA - Other Applications Abstract: Since 1978, the Chinese Government has implemented a series of enterprise and industrial reforms. These reforms have gradually transformed selected major industries in China into a mix of ownership types, at a different pace across regions. The reforms were used to rescue large SOEs that were operating at a loss with high debt-equity ratios, and to release small and medium-sized SOEs to exit the industry or become privatized. They also allowed foreign investors to participate in the market. As a result, three major ownership categories emerged: (1) SOEs, (2) privatized SOEs, and (3) newly established private enterprises. Private enterprises are better able to raise capital through joint ventures and the stock market, and have access to advanced technology and better management skills. This paper uses firm-level data to compare the impacts of enterprise reforms in a labor-intensive, government-controlled industry with those in a capital-intensive, competitive market industry. We choose the Chinese textile industry for the former and the computer industry for the latter. Historically, the textile industry was severely inefficient and unprofitable; it was a key reform target in the late 1990s and was transformed into a success story. For the textile industry, this study hypothesizes that (1) SOEs perform better than privatized SOEs and private firms, and (2) improvements in the efficiency and productivity of SOEs are due to the substitution of capital for labor. The computer industry, on the other hand, is capital intensive and would benefit more from foreign direct investment and exports. Enterprises in the computer industry operate in a more competitive environment. For this industry, we hypothesize that (1) private firms perform better than SOEs and privatized SOEs, and (2) FDI contributes toward superior performance through advances in production technology. We create a unique sample of SOEs, privatized SOEs, and private enterprises in 2004 and calculate input-oriented DEA meta-frontier efficiency scores for each industry to identify sources of inefficiency, e.g., managerial inefficiency, scale inefficiency, and lack of access to the best available technology. We also compare the input mix of SOEs in 1998 with that of the remaining SOEs in 2004 to test whether there is a shift in capital-labor concentration. Our findings should be helpful for policy makers in customizing public policies to further enhance enterprise performance as well as industrial development.

ID: 068 Disaggregate Input Efficiency of Banks in Taiwan and China Authors: Jin-Li Hu, Tzu-Pu Chang and Tao Ling Contact: [email protected] , [email protected] , [email protected] Category: SFA - Applications to Banking Abstract: This study analyzes total-factor input efficiency values of commercial banks operating in Taiwan and mainland China by employing a stochastic frontier approach (SFA). We apply the method developed by Zhou et al. (2012) to measure disaggregated input efficiency scores using balanced panel data on 35 banks across the Taiwan Strait for the period 2006 to 2011, a total of 210 bank-year observations. There are three inputs (number of employees (Labor), total customer deposits (Fund), and fixed assets (Capital)), two outputs (gross loans and total earning assets) and four environmental variables (years of operation, the gross domestic saving rate, the corruption perception index, and the index of economic freedom). Because the study period spans six years and covers banks on both sides of the strait, with differences in currencies and exchange rates, the input and output items are adjusted by a GDP deflator with 2000 as the base year and converted into millions of US dollars so that cross-strait information can be unified and interpreted. Our major empirical findings are as follows. As the corruption perception index improves, employees act more in the interest of the bank rather than in their own private interest, which enhances total-factor labor efficiency values. Moreover, there is a learning effect: a positive relation exists between total-factor labor efficiency and years of operation, as senior employees are more familiar with their work, which has a positive influence. There is also a significantly negative correlation between total-factor fund efficiency and years of operation. Older banks are more conservative in drawing up their market strategies, and most of the older cross-strait banks have a state-owned background; because deposit-taking cannot be managed with profit as the ultimate or only goal, the efficiency of fund utilization is relatively inconspicuous. The statistics show no correlation between total-factor capital efficiency and any of the environmental variables. One possible reason is that fixed assets account for a relatively small proportion of a bank's total capital, so the method cannot detect any relationship.

ID: 069 Regulatory Incentives to Water Losses Reduction: The case of England and Wales Authors: Humberto Brea, Sergio Perelman and David Saal Contact: [email protected] Category: SFA - Applications to Regulation Abstract: In recent years, England and Wales have suffered droughts. This unusual situation defies the common belief that the British climate provides abundant water resources and has prompted the regulatory authorities to impose bans on superfluous uses of water. Furthermore, a large percentage of households in England consume unmetered water, which is detrimental to water-saving efforts. Our study estimates the shadow price of water using panel data from reports published by the Office of Water Services (Ofwat) for the period 1996 to 2010 (three regulatory periods). We estimate a parametric multi-output, multi-input input distance function and assume a translog technology. Additionally, we control for other factors such as water quality, pressure, water source and whether the company is also engaged in sewerage-related activities. Following O'Donnell and Coelli (2005), we use a Bayesian econometric framework in order to impose regularity (monotonicity and curvature) conditions on a highly flexible technology. Our estimations offer guidance for imposing penalties and provide an assessment of how the water distribution companies deal with water losses under the current regulatory regime. The approach used in this study follows the trend of treating a bad output as an input, which is more intuitive if we consider the trade-off between a negative externality and investment in environmentally friendly technologies and practices. We measure shadow prices in terms of capital and of operating expenditure (OPEX) as a proxy for labour costs. We find that the shadow price of water in terms of capital decreases over time; this result appears paradoxical given that the water distribution industry is becoming much more capital intensive. The apparent contradiction is resolved by taking into account that the technical ability to reduce water losses with capital has improved over the analysed period. On the other hand, the shadow price of water in terms of OPEX increased, which is expected given that OPEX decreased faster than water losses. In general, the current trend is positive, since companies would not need to invest large amounts of capital to reduce water losses. The relevance of the study is quite general, as water scarcity is a problem that will become more important with population growth and the impact of climate change.

ID: 070 Productivity, Firm Value Creation and its Distribution: The Spanish Health Care Case Authors: Emili Grifell-Tatjé and Pau Turon Dols Contact: [email protected] , [email protected] Category: Index Numbers - Applications Abstract: This paper applies methodological advances developed in the economics of efficiency and productivity to business research based on stakeholder theory. Healthcare expenditure has become a worrying question for developed economies and, in the case of Spain, it represented 9.7% of GDP in 2009. The Spanish National Health System (NHS) is both the regulator and the main customer and, in 2005, it changed its health care policy, progressively exerting its strong capacity to influence pharmaceutical firms. By political decision it committed to a drastic reduction of healthcare expenditure, first using indirect mechanisms and later applying further direct measures through the approval of laws which fixed much lower prices for drugs and services. The regulator, i.e. the NHS, fixed lower prices while, at the same time, keeping the obligation to supply. We study this situation over the period 2003-2012 in the framework of the Spanish healthcare sector, where a multinational pharmaceutical firm reacts to the pressure of the public sector (its main stakeholder). This paper intends to fill the gap detected in the stakeholder theory literature by performing a detailed empirical study, based on high-quality information, that exemplifies the dynamics of firm value creation and its distribution. The stakeholder context: the firm's function of creating value is well known, but the social function of distributing the value created is not so obvious. The latter is accomplished when the firm pays employees' salaries, compensates its suppliers, gives dividends to shareholders, pays taxes to the government and pays interest to capital lenders. Ackoff (1994) noted that business activity is the only known way of simultaneously producing and distributing economic worth. The firm's performance is then the result of a two-stage cycle: value generation is the first stage and the capture (i.e. distribution) of the economic worth is the second. Therefore, it is not appropriate to assess business performance by discussing only the first step and ignoring the second. Hence, a central issue is to define accurately what is understood as "created economic worth or surplus". This study follows the methodological approach of Grifell-Tatjé and Lovell (2008, 2014), which proposes a theoretical framework based on the theory of production and the exploitation of the duality between price and quantity indicators.

ID: 073 Orthogonality conditions for identification of joint production technologies: An axiomatic nonparametric approach to the estimation of stochastic distance functions Authors: Christopher Parmeter, Andrew Johnson and Timo Kuosmanen Contact: [email protected] Category: SFA - Semi & Non-parametric Approaches Abstract: The classic econometric approach treats productivity as a residual term of the standard microeconomic production model. Critics of this approach argue that productivity shocks correlate with the input factors that are used as explanatory variables of the regression model, which causes an endogeneity problem. This paper sheds some new light on this issue from the perspective of production theory. We first examine the standard cost minimization problem to demonstrate that even if the observed inputs and outputs are endogenous, consistent estimation of the input distance function is possible under certain conditions. This result reveals that the orthogonality conditions required for econometric identification critically depend on the specification of the distance metric, which suggests the directional distance function as one possible solution to the endogeneity problem. We then introduce a stochastic data generating process of joint production where all inputs and outputs correlate with inefficiency and noise. We show that an appropriately specified direction vector can provide the orthogonality conditions required for identification of the directional distance function. A consistent nonparametric estimator of the directional distance function is developed, which satisfies the essential axioms of production theory. Specification of the direction vector is examined through an application to electricity distribution firms.
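
For readers who want the object being identified spelled out (a standard definition, not the paper's notation), the directional distance function with direction vector g = (g_x, g_y) is

\[
\vec{D}(x, y; g_x, g_y) = \sup\{\beta \ge 0 : (x - \beta g_x,\ y + \beta g_y) \in T\},
\]

where T is the production possibility set; the paper's point is that the orthogonality (moment) conditions available for estimation depend on how this direction vector is specified.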

ID: 074 INCORPORATING NONPOSITIVE PROFIT IN A TRANSLOG PROFIT FUNCTION Authors: Christopher Parmeter and Jaap Bos Contact: [email protected] Category: SFA - Applications to Banking Abstract: The study of profit efficiency often begins with the empirical specification of the profit function/frontier, for which the translog functional form is generally agreed upon as a valid second-order approximation to the true underlying profit function/frontier. However, it is common in applied work to encounter nonpositive profit, for which the logarithm is undefined. An extensive body of research (notably in the banking literature) has put forth various 'fixes' when there are firms that have nonpositive profits. These include: (1) excluding the observations with nonpositive profit; (2) adding a constant to all profits so that nonpositive profits become positive; and (3) censoring nonpositive profit at 1 and including an indicator variable on the right-hand side to capture this censoring. None of these approaches is technically valid. Excluding observations induces a nasty selection problem, since the exact curvature of the profit function likely cannot be adequately characterized in the region where nonpositive profits exist; moreover, firms with nonpositive profit will most likely (but not always) be largely inefficient, so it is odd to assess profit inefficiency by eliminating the firms that a priori are likely to be heavily inefficient. Adding a constant to all profits dramatically changes the shape and variance of the distribution of profits: for example, one can show that Var(ln(x)) > Var(ln(x+c)) for a positive constant c, and the larger c is, the larger the reduction in variance. This has disastrous consequences for the subsequent empirical analysis. Lastly, censoring profit at 1, so that ln(1)=0, has no substantive theoretical or statistical basis, which makes empirical justification of the method tenuous. This paper proposes a simple-to-deploy variable transformation that retains the elegant features of the translog as a second-order approximation device, while allowing for negative profits. The new transformation is compared with existing approaches via Monte Carlo simulations as well as an illustrative example.
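
A quick numerical check of the variance-compression argument above, Var(ln(x)) > Var(ln(x+c)) for positive data x and constant c > 0, purely for illustration (the lognormal sample and the values of c are arbitrary assumptions, not data from the paper):

import numpy as np

rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)  # strictly positive "profits"

for c in (0.5, 5.0, 50.0):
    # the variance of ln(x+c) shrinks as the shift c grows
    print(f"c={c:5.1f}  Var(ln(x))={np.var(np.log(x)):.3f}  "
          f"Var(ln(x+c))={np.var(np.log(x + c)):.3f}")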

ID: 075 Local and Global Scale Characteristics in Non-Convex Nonparametric Technologies Authors: Giovanni Cesaroni, Kristiaan Kerstens and Ignace Van de Woestyne Contact: [email protected] , [email protected] , [email protected] Category: DEA - Theory Abstract: The purpose of this contribution is to empirically implement and supplement the proposals made by Podinovski (2004) to explore the nature of both global and local returns to scale in non-convex nonparametric technologies. In particular, employing some secondary data sets, we investigate the frequency of the special case of sub-constant global returns to scale. Furthermore, we look at the evolution of ray-average productivity per observation and check how often global and local returns to scale yield concordant and conflicting information.

ID: 078 Trade over distance for New Zealand firms: measurement and implications Authors: Guanyu Zheng Contact: [email protected] Category: SFA - Semi & Non-parametric Approaches Abstract: This paper investigates the proximity of firms to their customers to assess the extent to which different industries trade their output over distance within New Zealand. At the sector level, the output of the primary sector is most easily traded across distance, followed by the goods-producing sector and then the services sector. However, these broad results mask considerable variation at the industry level. For example, some knowledge-intensive service industries are concentrated in Auckland and routinely sell their output to firms and households located in other regions of the country. The paper goes on to assess the impact of tradability on a number of firm characteristics. This shows that domestic tradability and exporting are linked, with firms that trade their output over distance within New Zealand more likely to export internationally. This correlation between domestic tradability and exporting suggests that firms producing low-tradability products for the local labour market are unlikely to increase scale through exporting. Data on firm size bears this out with a clear link between domestic tradability and firm size in the goods-producing and service sectors. Across sectors, with services generally less tradable domestically and with a lower export share relative to goods, the average firm in the service sector is much smaller than in the goods-producing sector. Finally, the paper presents evidence showing that firms producing tradable output in the goods and service sectors tend to have higher labour productivity than firms that service their local markets. This suggests that the extent to which firms trade their output over distance influences the potential for agglomeration, scale and competition, which are all key drivers of productivity. The focus of the paper is on measuring domestic tradability and its implications for firms. However, the link between tradability and labour productivity highlights the importance of policy interventions aimed at improving connectivity between the cities and regions of New Zealand. Given that service sector firms are more likely to be insulated within a local labour market, the results also highlight the importance of stimulating competition in services industries.

ID: 079 Luenberger Environmental Indicator: A New Approach to Measuring Productivity and Efficiency Authors: Md Azad and Tihomir Ancev Contact: [email protected] , [email protected] Category: Environmental Aspects in Productivity & Efficiency Abstract: The present study explores a new way of measuring productivity and efficiency by modifying the conventional Luenberger productivity indicator. The Luenberger approach has so far been applied to productivity and efficiency measurement in time-varying contexts, and has been used mainly in international productivity growth comparisons and efficiency comparisons of production units over a period of time. This study is the first attempt to use the Luenberger approach in an alternative way, by constructing a Luenberger environmental indicator based on cross-sectional, spatially explicit data instead of time series data. The advantage of this environmental indicator is that it can be employed to measure the environmentally adjusted performance of productive units across space by incorporating undesirable outputs in the production model. The use of the modified indicator enables us to conduct an environmentally adjusted productivity and efficiency analysis. The model is applied empirically to the Australian irrigation sector, considering 17 natural resource management regions within the Murray-Darling Basin. Findings show that the environmentally adjusted efficiencies of irrigated enterprises vary considerably across the regions. The differences are largely driven by technological variation rather than efficiency variation, implying that the difference in environmental efficiency of irrigated enterprises across regions is mostly due to technological variation. This newly developed model can be widely used in any agricultural sector and elsewhere in the economy to measure relative productivity and environmental efficiency across space.
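
For comparison with the conventional time-based form that the study modifies, the standard Luenberger productivity indicator between periods t and t+1, written with directional distance functions that expand good outputs y and contract bad outputs b, is

\[
L_{t}^{t+1} = \tfrac{1}{2}\Big[\big(\vec{D}^{t}(x^{t},y^{t},b^{t}) - \vec{D}^{t}(x^{t+1},y^{t+1},b^{t+1})\big) + \big(\vec{D}^{t+1}(x^{t},y^{t},b^{t}) - \vec{D}^{t+1}(x^{t+1},y^{t+1},b^{t+1})\big)\Big],
\]

with positive values indicating improvement; the indicator proposed above applies the same construction across spatially explicit cross-sectional units (the 17 regions) rather than across time periods.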

ID: 082 Dissections of Input and Output Efficiency: A Generalized Stochastic Frontier Model Authors: Subal C. Kumbhakar and Mike Tsionas Contact: [email protected] , [email protected] Category: SFA - Theory Abstract: In this paper we consider a specification that looks similar to the directional distance function (DDF) but does not restrict the choice of direction a priori, and the directions are not the same for all pairs of inputs and outputs. Instead of focusing on directions, we focus on proportional reductions in inputs and increases in outputs which are input- and output-specific. Note that directions are important in determining the reduction in inputs and the increase in outputs; if these proportions are known, the direction can be determined automatically. Thus, in our formulation the focus is not on the direction but on the proportional reduction in inputs and increase in outputs. Since we deal with proportions, our measures are not affected by units of measurement and we can use popular functional forms such as the Cobb-Douglas and the translog. We specify the production technology in terms of a transformation function which is augmented to include inefficiency in $Y$ and $X$, i.e., $F(\lambda \odot Y, \theta \odot X) = A$, where $\lambda_m \ge 1$ indicates inefficiency in output $Y_m$ and $\theta_j \le 1$ indicates inefficiency in the use of input $j$. Note that here $\lambda$ and $\theta$ are output- and input-specific, and $\odot$ represents the Hadamard product. We call this the generalized stochastic frontier (GSF) model, where the stochastic part comes from $A$, specified as $A = A_0 e^v$. Here $v$ is a random noise term with zero mean. Similar to the DDF, the SF models used in the literature can be shown to be special cases of the GSF. For example, if $\theta_j = 1 ~\forall j$ and $\lambda_m = \lambda ~\forall m$ the GSF model reduces to the output-oriented technical inefficiency model. Similarly, if $\lambda_m = 1 ~\forall m$ and $\theta_j = \theta ~\forall j$ the GSF model reduces to the input-oriented technical inefficiency model. Finally, if $\lambda_m = \lambda ~\forall m$, $\theta_j = \theta ~\forall j$ and $\lambda \times \theta = 1$, the GSF model reduces to the hyperbolic inefficiency model. Another special case (in a single-output model) is when $\lambda = 1$; this would be a model to estimate input slacks, given output. We use the generalized method of moments to estimate the model.
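
To make the nesting explicit, here is one worked special case, under the assumption of a single output and a transformation function of the ratio form $F(y, x) = y / f(x)$, which is an illustrative choice rather than the form used in the paper. With output-oriented inefficiency only ($\theta_j = 1$ for all j and $\lambda_m = \lambda$):

\[
F(\lambda y, x) = \frac{\lambda y}{f(x)} = A_0 e^{v} \;\Longrightarrow\; \ln y = \ln A_0 + \ln f(x) + v - u, \qquad u = \ln \lambda \ge 0,
\]

which is the familiar output-oriented stochastic frontier model with one-sided error u.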

ID: 083 Aggregation and Productivity Measurement with Multiple Outputs and Inputs: Analytics and Empirics Authors: Prasada Rao, Knox Lovell and Antonio Peyrache Contact: [email protected] , [email protected] , [email protected] Category: Index Numbers - Applications Abstract: The paper examines the problem of aggregation in productivity measurement involving multiple inputs and outputs. Various aggregation measures proposed in the literature are examined with special focus on their conceptual framework, their analytical properties and their effect on productivity measures. The empirical part of the paper presents two illustrations: one based on micro-level data and another based on a macro-level data set.

ID: 084 Mix inefficiency and its determinants in integrated crops-livestock production systems in Ghana: A nonparametric approach Authors: Renato A. Villano, Ian W. Patrick and George E. Battese Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Agriculture & Fisheries Abstract: The purpose of this paper is to examine mix inefficiency in integrated crops-livestock production systems in Ghana. Mix inefficiency arises when changes in input or output mixes could potentially enhance productivity. It is likely to occur in crops-livestock farming systems because most of the inputs used, such as land, labour and capital, tend to be fixed in the short run. In addition, time lags in the adoption of improved technologies, and risks associated with input allocation, also lead to limitations in movements around the frontier isoquants. Further, the adaptation of crops and livestock to certain production conditions, such as water requirements, drought tolerance and the market demand for the products, also leads to restrictions on the output mixes available to farmers. Considering the difficulty smallholders face in increasing scale due to various production constraints, mix efficiency offers smallholders the opportunity to increase their productivity, either by allocating the same inputs over a different combination of outputs or by changing the input mix used to obtain a given output mix. This paper uses cross-sectional data collected on 600 farm households to assess the factors influencing the mix inefficiency of integrated crop-livestock producers in Ghana. Using expenditure on major crops and livestock alongside other inputs (land, labour and capital), with quantities of crops harvested and livestock managed as the output variables, nonparametric data envelopment analysis is used to estimate the multiplicatively complete Färe-Primont productivity index. This index is decomposed into various efficiency measures, including mix efficiency, and the factors influencing mix efficiency are examined. Our results reveal evidence of mix inefficiency in crop-livestock production in Ghana. Further, age, gender, distance to the nearest roads, extension contacts, membership of farmer-based organizations, household size, access to credit, farm income and the number of crops and livestock produced significantly influenced the mix inefficiency of crop-livestock producers. For changes in scope to be translated into enhanced productive efficiency in crop-livestock production in Ghana, these factors need to be considered by policy makers and other development partners. In addition, interventions aimed at enhancing the development and promotion of integrated crops-livestock production in Sub-Saharan Africa should underscore the mix efficiency of producers in such systems.

ID: 085 The Analysis on Environmental TFP and Influencing Factors of China's Service Sector: 1998-2010 Authors: Rui-zhi Pang, Xue-jie and Bai Yu-dong Contact: [email protected] , [email protected] , [email protected] Category: Environmental Aspects in Productivity & Efficiency Abstract: China is now experiencing a transition in its economic growth model, from excessive resource consumption towards sustainable development, and its economic and industrial structures are being upgraded. However, this process has been handicapped by the low productivity of the service sector, which faces serious problems: low productivity, a low share of GDP and high resource consumption. In contrast to previous analyses of service sector efficiency and productivity, this paper evaluates the inefficiency and environmental TFP of the Chinese service sector taking total energy consumption and CO2 emissions into account. The paper builds a framework for evaluating the environmental TFP of the service sector, adopting a slacks-based sequential directional distance function and a sequential Luenberger productivity indicator. With this framework, the paper measures the environmental TFP and efficiency of the service sector in 29 Chinese provincial areas from 1998 to 2010. In addition, the paper discusses the factors influencing TFP in China's service sector and its decomposition indicators. The study finds that the environmental inefficiency of China's service industry is on a growing trend; the overuse of energy and excess emissions of CO2 are the main sources of this environmental inefficiency, but there is significant regional diversity. The environmental TFP of the eastern region has increased the fastest, followed by the central region, while that of the western region has decreased. LTFP can be decomposed into LPTP, LPEC and LTPSC. We find that LPTP is the main contributor to productivity growth, and LSEC and LTPSC have a positive effect on LTFP; it is worth noting that in all regions LPEC has shown significant deterioration. Furthermore, the paper explores the influence on the environmental TFP of China's service sector of factors such as the extent of market orientation, GDP per capita, the urbanization rate, FDI scale, human capital, the proportion of high energy-consuming industries within the service sector, and the ICT level, and reports some interesting findings. Key words: China, service sector, environmental TFP, influencing factors.

ID: 086 The Productivity of the Internet Measured from a Consumer Perspective Authors: Russel Cooper Contact: [email protected] Category: Other Topics Abstract: How productive is the Internet from a consumer perspective? While a number of studies have sought to place an economy-wide value on the Internet, there has been no general agreement to this point on either the appropriate methodology or the likely validity of results. This paper attempts to add new evidence by utilizing a large data-set of home Internet users and exploiting duality theory to provide a very general but rigorous theoretical setting that backs up the econometric estimation. From a static utility-maximizing perspective, information on consumers' allocation of time to various types of Internet activities is studied analogously to the traditional consumer allocation problem, but subject of course to a time constraint rather than simply the traditional money budget constraint. The model utilized in this paper considers the consumer as subject to two budget constraints: the traditional money expenditure constraint and a time budget constraint. In contrast to much of the literature, these are not merged into one 'full' budget constraint but are treated as two separate constraints which nevertheless interact, with interesting consequences. In addition, the concept of consumer profit, originally attributed to Ragnar Frisch, is utilized. The consumer profit function with total Internet time given may be regarded as a type of variable profit function, with total Internet time treated like capital in the production literature. The difference, of course, is that total Internet time is not something that requires time to adjust (unless addiction is considered). Consequently, an instantaneous optimisation can be conducted to choose total Internet time to maximize consumer profit. Measures of consumer welfare are evaluated using the dual of the indirect utility function, viz. the (money) expenditure function, in a traditional Hicksian 'compensating variation' type of counterfactual analysis. However, in view of the possibility of endogeneity of total Internet time, there are at least two concepts of money expenditure that can be used: either treating total Internet time as optimized out (as a function of the shadow prices of time spent on various Internet activities as well as of the price of utility) or taking it as given. One objective of this research is to examine the difference in results under these two approaches.

ID: 087 Real-Financial Linkages through Financial Social Accounting Matrix (FSAM) for Emerging Market Economies: An Extension of Asia KLEMS and WIOD Database Authors: Hak K. Pyo, Hyunbae Chun and Keun Hee Rhee Contact: [email protected] , [email protected] , [email protected] Category: Other Topics Abstract: Paper to be considered for presentation at the Special Session on Asia KLEMS chaired by Marcel Timmer, Asia Pacific Conference 2014, July 8-11, Brisbane, Australia. Among the unique features of the recent global financial crisis of 2007-2008 are its rapid transmission from the advanced economies, where it originated, to the emerging market economies, which it affected most, and the wide scope of its impacts on the real sectors of the affected economies. In the present paper we propose to use a Financial Social Accounting Matrix (FSAM) framework based on the Asia KLEMS and WIOD databases to identify the transmission mechanism of such a financial crisis and to assess its impacts on industries and households through an analysis of multiplier and redistribution effects. Recent applications of FSAM to small open economies are found in Emini and Fofack (2004) for Cameroon, Saddiqi and Salem (2006) and Leung and Secrieru (2011) for Canada, Seng, Azali and Chin (2009) for Malaysia, Hubic (2012) for Luxembourg, and Pyo et al. (2011), Pyo, Rhee and Lee (2012) and Pyo, Rhee and Song (2013) for the Republic of Korea. We demonstrate that the combined use of Flow of Funds data with the Input-Output Matrix from the WIOD database in a consistent FSAM framework can be linked to industry-based real sector models through the EU KLEMS or Asia KLEMS databases, and to household income survey data, to analyze the multiplier and redistribution effects of such a financial crisis on household income distribution. Keywords: Global Financial Crisis, Financial Social Accounting Matrix, Input-Output analysis, multiplier effects and income redistribution effects. JEL Classifications: A13, G29. (The authors are, respectively, Professor of Economics Emeritus, Seoul National University; Professor of Economics, Sogang University; and Senior Researcher, Korea Productivity Center.)

ID: 089 Semiparametric Penalized Moment Estimation of Multiple-Output Stochastic Frontier Production Functions Authors: Hung-pin Lai and Cliff J. Huang Contact: [email protected] , [email protected] Category: SFA - Semi & Non-parametric Approaches Abstract: Estimations of multiple-output multiple-input production technology and the associated production

efficiency generally rely on either the nonparametric data envelopment analysis (DEA) approach or the parametric stochastic frontier analysis (SFA) estimation of distance functions. The former approach is known to be sensitive to noisy data or measurement error, while the estimates of the distance function in the latter approach are generally biased and inconsistent due to the correlation of the endogenous regressors with the composite error term in the regression. In this paper, we propose using the penalized moment approach to estimate a semiparametric spline formulation of the distance function. We suggest using the single-index model to formulate the input or output index in order to avoid the so-called “curse of dimensionality”. Using the asymptotics where the number of knots of the spline function is fixed, we are able to show the consistency and asymptotic normality of the estimators of all parameters.
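In stylized form (generic notation, not the authors’ exact estimator), a penalized moment estimator of the spline coefficients θ of the distance function solves

\[ \hat{\theta} = \arg\min_{\theta}\ \bar{g}_n(\theta)'\, W_n\, \bar{g}_n(\theta) + \lambda_n\, J(\theta), \]

where \bar{g}_n(\theta) stacks sample moment conditions built from instruments orthogonal to the composite error, W_n is a weighting matrix, J(θ) is a roughness penalty on the spline coefficients, and λ_n governs the smoothness of the fitted distance function.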

ID: 090 TOWARDS A COMPOSITE PUBLIC SECTOR PERFORMANCE INDICATOR Authors: Giannis Karagiannis and Suzanna M. Paleologou, Contact: [email protected] , [email protected] Category: DEA - Applications to Public Sector Abstract: We use the BoD (Benefit-of-the-doubt) model, which essentially is a radial single-constant-input DEA

model, to construct Opportunity, Musgravian and Total Public Sector Performance (PSP) indicators for 23 industrialized OECD countries for 1990 and 2000. The construction of Opportunity and Musgravian indicators is based on seven sub-indicators, namely administration, education, health and public infrastructure on the one hand and distribution, stability and economic performance on the other, which in turn are aggregates of other sub-indicators. The multiplier formulation of the BoD models provides the estimates of the necessary weights to aggregate sub-indicators to indicators. We develop three alternative estimates based on weights obtained from self-appraisal, peer-appraisal and a common set of weights to be compared to the equal weights case commonly used in the literature. The first are directly obtained by applying the BoD model to the data at hand, the second are extracted from the cross-efficiency matrix, and the third are obtained from a restricted least squares model with the self-appraisal efficiency estimates as the dependent variable and the relevant sub-indicators as independent variables.
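For concreteness, a minimal sketch of the benefit-of-the-doubt multiplier problem, i.e. an output-oriented DEA model with a single constant input, is given below; the data are invented and the code is illustrative only, not the authors’ implementation.

# Hedged sketch of a benefit-of-the-doubt (BoD) composite indicator,
# computed as a radial single-constant-input DEA multiplier LP (toy data).
import numpy as np
from scipy.optimize import linprog

Y = np.array([            # rows = countries, cols = sub-indicators
    [0.7, 0.9, 0.6],
    [0.8, 0.5, 0.9],
    [0.6, 0.6, 0.7],
])

def bod_score(Y, o):
    """Benefit-of-the-doubt score of unit o: max w'y_o s.t. w'y_j <= 1, w >= 0."""
    n, m = Y.shape
    c = -Y[o]                     # linprog minimizes, so negate the objective
    res = linprog(c, A_ub=Y, b_ub=np.ones(n),
                  bounds=[(0, None)] * m, method="highs")
    return -res.fun               # composite indicator in (0, 1]

scores = [round(bod_score(Y, o), 3) for o in range(Y.shape[0])]
print(scores)                     # the best-performing unit(s) score 1.0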

ID: 091 PRODUCTIVITY MEASUREMENT IN RADIAL DEA MODELS WITH MULTIPLE CONSTANT INPUTS Authors: Giannis Karagiannis and C. A. Knox Lovell Contact: [email protected] , [email protected] Category: DEA - Theory Abstract: We consider productivity measurement based on radial DEA models with multiple constant inputs. We

show that in this case the Malmquist and the Hicks-Moorsteen productivity indices coincide and are multiplicatively complete, the choice of orientation for the measurement of productivity change does not matter, and there is a unique decomposition of productivity change containing three independent sources, namely technical efficiency change and the magnitude and output bias components of technical change. We also show that an aggregate productivity index is given by the simple arithmetic mean of individual productivity indices.
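For reference, in standard notation (not a reproduction of the paper’s derivations), the output-oriented Malmquist productivity index between periods s and t and its usual decomposition are

\[ M_O = \left[ \frac{D_O^{s}(x_t,y_t)}{D_O^{s}(x_s,y_s)} \cdot \frac{D_O^{t}(x_t,y_t)}{D_O^{t}(x_s,y_s)} \right]^{1/2} = \underbrace{\frac{D_O^{t}(x_t,y_t)}{D_O^{s}(x_s,y_s)}}_{\text{efficiency change}} \times \underbrace{\left[ \frac{D_O^{s}(x_t,y_t)}{D_O^{t}(x_t,y_t)} \cdot \frac{D_O^{s}(x_s,y_s)}{D_O^{t}(x_s,y_s)} \right]^{1/2}}_{\text{technical change}}, \]

where D_O^r denotes the output distance function relative to the period-r technology; the Hicks-Moorsteen index is instead the ratio of a Malmquist output quantity index to a Malmquist input quantity index, and the paper’s result is that the two coincide when all inputs are constant, as in the models considered here.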

ID: 092 Using health-outcomes measures for analysing cost and quality in health care Authors: Fanny Goude, Cecilia Dahlgren, Clas Rehnberg and Emma Medin Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: DEA - Applications to Health & Education Abstract: Introduction: Most health care systems are faced with the challenge of providing high-quality health care

services at reasonable costs. In order to investigate the relationship between cost and health outcomes, access to comparable indicators reflecting the quality of the services provided is required. The national disease-specific quality registers in Sweden provide unique opportunities for detailed analysis of productivity at the clinical departmental level. The aim of this study was to illustrate how data from these registers could contribute to a better understanding of the relationship between cost and quality in health care services in Sweden. Method: The Data Envelopment Analysis (DEA) method was used to analyse the productivity in health care. Patient-level data from linkable Swedish national registers were collected for 2009-2011. Input cost, outputs and health-outcomes were estimated by combining data from the registers. Several DEA-models were estimated, where the base model consisted of a single input (cost) and a single output (number of patients treated). Patient case-mix adjusted health-outcome measures were added in the following models. Results: The productivity analysis showed that the inclusion of health-outcome measures as a proxy for quality of the provided health care services has an impact on the efficiency scores and changes the ranking of the clinics in the benchmarking. Conclusion: The relationship between cost and the quality of provided health care services, measured by case-mix adjusted disease-specific health-outcome measures, is complex and there is no direct inverse relationship between cost and quality.
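To make the base model concrete, a minimal input-oriented DEA sketch of the kind described (single cost input, patient and outcome outputs) is shown below with invented data; it is illustrative only and uses a variable-returns-to-scale envelopment form, which may differ from the authors’ exact specification.

# Hedged sketch of an input-oriented VRS DEA envelopment model for clinics
# (toy data: one cost input, patients treated and an outcome measure as outputs).
import numpy as np
from scipy.optimize import linprog

X = np.array([[120.0], [150.0], [200.0], [180.0]])   # input: cost per clinic
Y = np.array([[100.0, 0.80],                          # outputs: patients treated,
              [110.0, 0.85],                          # case-mix-adjusted outcome
              [160.0, 0.70],
              [150.0, 0.90]])

def dea_input_vrs(X, Y, o):
    """Farrell input efficiency of unit o: min theta s.t. (theta*x_o, y_o)
    is dominated by a convex combination of observed units."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                       # minimize theta
    A_out = np.c_[np.zeros((s, 1)), -Y.T]             # sum_j lam_j y_j >= y_o
    b_out = -Y[o]
    A_in = np.c_[-X[o].reshape(m, 1), X.T]            # sum_j lam_j x_j <= theta*x_o
    b_in = np.zeros(m)
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)      # convexity: sum lam_j = 1 (VRS)
    res = linprog(c, A_ub=np.vstack([A_out, A_in]), b_ub=np.r_[b_out, b_in],
                  A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * (n + 1),
                  method="highs")
    return res.x[0]                                   # efficiency score in (0, 1]

print([round(dea_input_vrs(X, Y, o), 3) for o in range(X.shape[0])])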

ID: 093 Assessing Research Productivity at Faculty and Department Level: The Case of Greek Departments of

Economics Authors: Giannis Karagiannis and Georgia Pashalidou Contact: [email protected] Category: DEA - Applications to Health & Education Abstract: In this paper we consider five variants of the Benefit of the Doubt (BoD) model, which is a radial single

constant input DEA model, to assess research productivity at both the faculty and the department level in Greek Departments of Economics during the period 2000-2010. Performance is evaluated on the basis of three research outputs, namely the number of journal publications, their quality standards, and the number of citations. Once research productivity is estimated at the faculty member level, the results are aggregated in a theoretically consistent way to obtain research performance at the department level. The results from the conventional BoD model are then compared and contrasted, in terms of ranking and aggregate (department) efficiency, with those obtained using common weights, imposing weights restrictions reflecting stakeholders’ (i.e., dean, rector, etc.) preferences, evaluating cross efficiencies, and conducting VEA (value envelopment analysis).

ID: 097 Measuring firm level energy efficiency in Swedish industry Authors: Shanshan Zhang, Tommy Lundgren and Per-Olov Marklund, Contact: [email protected] , [email protected] , [email protected] Category: SFA - Applications to Energy Abstract: Greenhouse gas emissions are considered a key issue by the European Commission, and the use of energy

in a more competitive and sustainable manner has been reflected in various EU energy policies. Improving energy efficiency is one of the top priorities. Therefore, it is necessary to understand and measure energy efficiency appropriately to be able to provide insightful policy implications. Our study focuses on industrial energy efficiency measurement of 4481 firms across 14 sectors from 2000 to 2008 in Swedish manufacturing. Energy efficiency is defined and formulated within the context of production theory, where energy demand frontiers are derived from two different perspectives: cost minimization and profit maximization. The cost minimizing approach always achieves efficiency improvements by reducing energy consumption, which fits into the energy conservation idea that is embraced in most policies directed towards energy efficiency. The profit maximization approach allows efficiency improvements to take place even when energy use is expanding; firms may use energy more efficiently even while energy use per unit of output is unchanged. The combination of energy demand modeling and Stochastic Frontier Analysis (SFA) frames an econometric approach for energy efficiency estimation in a panel data setting. The True Random Effects (TRE) frontier model is adopted in this study in an attempt to adequately separate out firm-specific heterogeneity from inefficiency. Finally, we explore possible determinants of inefficiency and find that the EU ETS has not been a significant driver of energy efficiency, possibly due to low permit prices.
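In stylized form (generic notation, not necessarily the authors’ exact specification), an energy demand frontier in a true random effects setting can be written as

\[ \ln E_{it} = \alpha_i + f(y_{it}, p_{it}, z_{it}; \beta) + v_{it} + u_{it}, \qquad v_{it} \sim N(0,\sigma_v^2), \quad u_{it} \sim N^{+}(0,\sigma_u^2), \]

where E_{it} is the firm’s energy use, α_i is a firm-specific random effect capturing time-invariant heterogeneity, v_{it} is statistical noise and the one-sided term u_{it} ≥ 0 measures energy use in excess of the frontier demand, so that energy efficiency is EE_{it} = exp(−u_{it}).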

ID: 099 Evaluating firm level energy efficiency performance in Swedish manufacturing with linear programming

models Authors: Shanshan Zhang Contact: [email protected] , Category: DEA - Applications to Energy Abstract: The EU climate and energy targets for 2020, known as the “triple 20 by 2020” targets, aim at a 20%

reduction in greenhouse gas emissions; a 20% increase in the share of renewable energy consumption; and a 20% improvement in energy efficiency from the 1990 levels. Thus, measuring energy efficiency and increasing the understanding of its performance over time are important for policy makers. This study assesses energy efficiency at the firm level across all sectors of Swedish manufacturing from 2000 to 2008. In the study, firstly, we construct a joint production framework that contains both desirable and undesirable outputs, from which we are able to estimate a composite energy efficiency performance index by using a slacks-based, input-oriented Data Envelopment Analysis (DEA) model. In order to capture energy mix changes while estimating energy efficiency, we treat different energy types as distinct energy inputs. Using unique firm-level data, we estimate the energy efficiency performance index as well as the maximum energy-saving potentials for each firm. Additionally, we exclude undesirable outputs from the DEA model, and the regulation effects are calculated by comparing the two models, with and without bad outputs. Results show that the energy-intensive sectors, such as pulp/paper and iron/steel, achieved higher energy efficiency levels over the years. Results also show that environmental regulations have had positive impacts among nearly all sectors.
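A stylized version of the kind of slacks-based, input-oriented measure described (generic notation, not the authors’ exact model) is

\[ \rho^{*} = \min_{\lambda,\, s^{-}} \; 1 - \frac{1}{m}\sum_{i=1}^{m} \frac{s_i^{-}}{x_{io}} \quad \text{s.t.} \quad X\lambda + s^{-} = x_o,\; Y\lambda \ge y_o,\; \lambda \ge 0,\; s^{-} \ge 0, \]

where the inputs x include the distinct energy types, so the optimal slacks s^{-} on the energy inputs indicate energy-saving potential; the treatment of the undesirable outputs (included or excluded) is what distinguishes the two model variants being compared.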

ID: 103 Labour dependence, income diversification, rural credit and technical efficiency of small-holder coffee

farms in the Dak Lak Province, Vietnam Authors: Thong Quoc Ho and Truong Van Pham Contact: [email protected] Category: Bootstrap for DEA or SFA Abstract: The Vietnamese coffee sector is important not only for the country’s economy but also for the global coffee

market, since the country has been the second largest coffee producer and exporter in the world. Increasing coffee production efficiency may benefit coffee producers and increase coffee supply for the market as well. However, very few studies have focused on coffee production efficiency, and no previous study has investigated the influence of socio-economic factors, e.g., livelihood, amount of credit, independence in the labour source and differences in the ethnic groups of coffee farmers, on coffee production efficiency. This study examines the relationships between income diversification, rural credit, labour dependence, ethnicity and technical efficiency in coffee production. A face-to-face survey of 143 coffee farming households was conducted in the Cu M’gar District, Dak Lak Province, Vietnam. The stochastic frontier model showed that the mean technical efficiency score was 0.64 and that inefficiency effects were statistically significant. Consistently, both Maximum Likelihood Estimation (MLE) and Feasible Generalized Least Squares (FGLS) indicated that a higher level of diversity in income sources may negatively affect coffee production efficiency. Additionally, independence in the labour resource for coffee farming may help farmers increase the technical efficiency of coffee production. Credit had a positive and statistically significant relationship with the technical efficiency of coffee production. These relationships held especially true for smallholder coffee farms with ethnic minority households. The policy options of increasing credit, intensive investment in coffee production rather than diversifying the income sources of coffee farmers, and independent management strategies for labour sources are suggested as an integrated approach to improving the technical efficiency of smallholder coffee farms.
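A common specification of the kind described (generic notation; the authors’ exact model may differ) is the inefficiency-effects stochastic frontier of Battese and Coelli (1995),

\[ \ln y_i = x_i'\beta + v_i - u_i, \qquad v_i \sim N(0,\sigma_v^2), \qquad u_i \sim N^{+}(z_i'\delta,\, \sigma_u^2), \]

where y_i is coffee output, x_i the production inputs and z_i the household covariates (income diversification, credit, labour dependence, ethnicity) that shift the mean of the one-sided inefficiency term; technical efficiency is TE_i = exp(−u_i).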

ID: 104 Integrating undesirable outputs in production technology: The case of greenhouse gas emissions in French meat

sheep farming Authors: Hervé Dakpo, Philippe Jeanneaux and Laure Latruffe Contact: [email protected] , [email protected] , [email protected] Category: Environmental Aspects in Productivity & Efficiency Abstract: This paper aims at comparing various methods to integrate environmental externalities as undesirable outputs

in production technology with an application to agriculture. In the context of increasing concern regarding the state of the environment, it is necessary to integrate the by-production of environmental externalities in the assessment of economic performance. In the production frontier literature several methods have been proposed to integrate undesirable outputs, which may be appropriate for the case of agriculture. As early as 1989, Färe et al. (1989) developed the Hyperbolic Efficiency Measure (HEM), which computes a radial expansion and contraction of good and bad outputs, in line with the weak disposability assumption of undesirable outputs. With the development of the Directional Distance Function (DDF), Chung et al. (1997) proposed a new efficiency measure which allows the increase in desirable outputs and the reduction of bad outputs over a specific direction vector. The DDF approach assumes the weak disposability and the null-jointness of intended and unintended outputs. However, some criticisms have recently been formulated against these commonly used models, in particular regarding the weak disposability assumption and its ability to explicitly capture the nature of undesirable outputs. Murty et al. (2012) formulated a by-production technology which consists of estimating two sub-technologies: one for intended outputs and the other for unintended outputs. Sueyoshi and Goto (2012) proposed a new model based on two disposability concepts: natural and managerial disposability. There has so far been no discussion of the convergence or divergence of these methods. Our objective is therefore to carry out a systematic comparison of the aforementioned methods and to discuss their suitability for real data in agriculture, namely greenhouse gas emissions by meat sheep farms in French grassland areas. The application to the agricultural livestock sector is relevant because the complex interactions between agriculture and the environment can make the choice of a method difficult. The empirical application is on a sample of 1261 farm-year observations from 1987 to 2012. Results confirm the existence of improvement potentials (average eco-efficiency ranges from 0.55 to 0.89 over the period) but reveal differences between the approaches. HEM and DDF under the weak disposability assumption fail to meet all the requirements associated with the specificity of undesirable outputs.
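In standard notation, the directional distance function referred to here, for desirable outputs y, undesirable outputs b and direction vector g = (g_y, g_b), is

\[ \vec{D}(x, y, b;\, g_y, g_b) = \max\{\beta \ge 0 : (y + \beta g_y,\; b - \beta g_b) \in P(x)\}, \]

which expands good outputs and contracts bad outputs simultaneously along g, with a value of zero for observations on the frontier; the competing formulations compared in the paper differ mainly in how the output set P(x) treats the bad outputs (weak disposability, by-production, or natural/managerial disposability).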

ID: 105 Climate Change and Farm Business Productivity in Australian Broadacre Agriculture Authors: Nazrul Islam, Ross Kingwell, Vilaphonh Xayavong Contact: [email protected] , [email protected] , [email protected] , Category: DEA - Applications to Agriculture & Fisheries Abstract: The economic prosperity of nations relies heavily on the sustainable growth in agricultural productivity.

However, climate change and loss of productive land due to land use competition and environmental degradation are challenges to further improving agricultural productivity. This paper explores the effects of climate change and climate variability on agricultural productivity in a region of Australia. The south-west region of Australia has since the mid-1970s displayed a warming and drying trend in its climate, a change consistent with projected climate change. Relationships between productivity, profitability, climate variability and climate change are explored using panel data on farms located in the region. The data are detailed descriptions of the physical and financial characteristics of each farm business. The data are analysed using parametric and non-parametric methods. This study yields insights about farm characteristics and management strategies that weaken or strengthen farm viability in the face of climate variability and climate change. This paper identifies adaptation strategies that have improved farm business resilience. Implications of the study’s findings for the R&D and technology needs of the farmers are briefly discussed. Also discussed are the study’s implications for government policy that aims to mitigate the adverse impacts of climate change in the region.

ID: 107 Direct, Indirect and Total Elasticities of Substitution and Complementarity Authors: Anthony Glass, Karligash Kenjegalieva, Robin Sickles and Thomas Weyman-Jones Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: Production Theory Abstract: A recent feature of the applied spatial econometrics literature is that the fitted parameters from a model which

contains the spatial autoregressive variable cannot be interpreted as marginal effects. To address this issue it is recommended that direct, indirect and total marginal effects are computed. By blending this development in applied spatial econometrics with the well-established literature on non-spatial net and gross elasticities of substitution and complementarity, we develop direct, indirect and total elasticities of substitution and complementarity. Specifically, we extend ten familiar non-spatial elasticities of substitution and complementarity to the direct, indirect and total case. We apply our direct, indirect and total elasticities of substitution and complementarity in the context of a spatial Durbin translog cost function for state manufacturing in the U.S. for the period 1997-2008. Keywords: Spatial Econometrics; Production Duality; Spatial Durbin Translog Cost Function; Manufacturing. JEL Classification: C23; D24; L60.
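For readers unfamiliar with the spatial terminology, the standard result being extended can be stated as follows (generic LeSage-Pace notation, not a reproduction of the paper): in a spatial Durbin model y = ρWy + Xβ + WXθ + ε, the matrix of marginal effects of the k-th regressor is

\[ \frac{\partial y}{\partial x_k'} = (I_N - \rho W)^{-1}\left( \beta_k I_N + \theta_k W \right), \]

whose average diagonal element is the direct effect, whose average row sum is the total effect, and whose difference is the indirect (spillover) effect; the paper carries this direct/indirect/total logic over to elasticities of substitution and complementarity derived from the spatial Durbin translog cost function.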

ID: 109 Spatial Autoregressive and Spatial Durbin Stochastic Frontier Models for Panel Data with Asymmetric Efficiency

Spillovers Authors: Anthony Glass, Karligash Kenjegalieva and Robin Sickles Contact: [email protected] , [email protected] , [email protected] Category: SFA - Theory Abstract: We blend the literature on stochastic frontier models for non-spatial panel data with spatial econometric

techniques to develop spatial autoregressive and spatial Durbin stochastic frontier models which control for unobserved time-invariant heterogeneity. Efficiency is time-variant in both models and we assume a half-normal distribution for stochastic inefficiency. Both models are estimated using Maximum Likelihood and we propose a new iterative technique to maximize the concentrated log-likelihood function which is a combination of the approaches used in spatial econometrics and stochastic frontier analysis. Both models contain an endogenous spatial autoregressive term which we adjust for by including in the log-likelihood function the scaled logged determinant of the Jacobian of the transformation from the random disturbance to the dependent variable. Finally, we apply the estimators to an aggregate production frontier for 41 European countries over the period 1995-2008. In particular, we use the fitted models to discuss the asymmetry between efficiency spillovers to and from a country. Keywords: Spatial Stochastic Frontier Models; Asymmetric Efficiency Spillovers; Aggregate Production; European Countries. JEL Classification: C23; C51; D24.

ID: 111 Benchmarking gas distribution companies in Italy: challenges and opportunities Authors: Elena Fumagalli and Teresa Romano Contact: [email protected] , [email protected] Category: DEA - Applications to Regulation Abstract: This paper presents the first assessment of the efficiency of the Italian gas distribution sector after liberalization

in the year 2000. Current regulatory objectives include efficiency in operational costs and require the application of elements of yardstick competition to the activity of network management. We claim here that the high fragmentation and significant scale heterogeneity of the sector pose several difficulties in the application of this regulatory framework in practice. The objective of the present study is to highlight these challenges and to derive potential solutions. To this end, we investigate the characteristics of the Italian gas distribution sector, focusing on economies of scale, economies of density, the role of environmental variables as determinants of efficiency and, lastly, on productivity changes over time. Our work is based on a detailed and original database, collected with the support of the national energy regulatory authority. It includes 105 Italian gas distribution companies, observed over a period of four years (2008-2011). Data refer to the entire distribution activity as well as to its two main components: network management and retail service. In this study we employ a two-stage DEA (Data Envelopment Analysis) estimation based on a bootstrap technique, which provides statistical inference on the main drivers of companies’ efficiency. Overall, we find a low level of efficiency, but significant productivity changes over time, particularly at lower scale and lower levels of density. In general, our results support the idea that scale and density, together with geographical variables, are relevant determinants of efficiency, although these relationships do not seem linear. From these findings we derive several policy implications.

ID: 112 Causal Links between Financial Activity and Economic Productivity Growth: New Empirical Evidence Authors: Michael Graff Contact: [email protected] Category: Bootstrap for DEA or SFA Abstract: To clarify possible links between financial activity and economic performance and productivity, a series of two-

wave path models (translating longitudinal Granger causality into a panel framework) is estimated. Referring to a large panel of countries with observations on proxies for financial sector activity and real performance, ranging from 1970 to 2012, it is shown that, overall, finance was a so-called supply-leading determinant of economic performance. The data suggest, however, that there have been various changes to the finance-performance nexus. In particular, from about 1975 to 1980 and again from 2005 to 2010, financial activity tended to impair economic performance. The paper then refers to the standard models in the finance-performance nexus literature to reconcile the facts with the theory. It is shown that the "optimum finance" approach along with the Minsky model provides a plausible interpretation.

ID: 114 Assessing the Effect of High Performance Computing Capabilities on Academic Research Output Authors: Paul W. Wilson, Amy W. Apon, Linh B. Ngo and Michael E. Payne Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: DEA - Applications to Health & Education Abstract: This paper uses nonparametric methods and some new results on hypothesis testing with nonparametric

efficiency estimators and applies these to analyze the effect of locally-available high performance computing (HPC) resources on universities' efficiency in producing research and other outputs. We find that locally-available HPC resources enhance the technical efficiency of research output in Chemistry, Civil Engineering, Computer Science, and Physics, but not in Biology, Economics, English, or History. Our research results provide a critical first step in a quantitative economic model for investments in HPC.

ID: 116 Industry Productivity Measures Based on the Vintage Model Authors: Finn Førsund Contact: [email protected] Category: DEA - Other Applications Abstract: Industry-level production functions are often based on estimating an average function that is then blown up

to the aggregate level. However, if the production relations of the micro unit (firm) follow a vintage structure, with crucial differences in substitution possibilities ex ante and ex post, such an aggregation may give both limited and misleading information. The crucial difference at the micro level is that ex ante (before an investment is made) substitution between all factors of production, including capital, is possible as in a standard neoclassical production function, while ex post (once capital is committed) the input coefficients of the variable factors of production are frozen at given values. Capital, with its technology characteristics, is sunk and has no independent role except in defining the output capacity of the unit.

ID: 119 Productivity and Efficiency of Vietnamese Banking System: New Evidence using Färe-Primont Index Analysis Authors: Phuong Anh Nguyen and Michel Simion Contact: [email protected] , [email protected] Category: Index Numbers - Applications Abstract: The aim of this paper is to provide a clearer view on the recent evolution of the Vietnamese banking system

that can be useful to public authorities when taking restructuring decisions and to bank managers when comparing their productive performance with others. More precisely, this paper focuses on the evolution of productivity of Vietnamese banks over the recent period from 2008 to 2012, and on the evolution of the different components of this productivity: technical change, pure technical efficiency, and mix and scale efficiency. The methodology draws from recent developments in index theory dealing with the design of multiplicatively complete indexes in order to measure total factor productivity (TFP). These multiplicatively complete indexes are designed to satisfy all the economically relevant axioms and tests defined by index number theory. Moreover, completeness is a sufficient condition for decomposing TFP indexes into the various components mentioned above. We choose to use the Färe-Primont index, which belongs to this family, because its computation requires only data on input and output quantities, which is the case for our data. Our paper differs from other papers dealing with the Vietnamese banking system, which make use of Malmquist indexes. The methodology is applied to a balanced panel of 29 Vietnamese commercial banks. Technology is described by three inputs (deposits, fixed assets, and total operating expense) and three outputs (customer loans, investment securities, and total operating income), following the intermediation approach of banking activity. The analysis of the results is carried out in three steps. The first step focuses on the characterization of Vietnamese banking technology using aggregate quantities of input and output used for the computation of productivity indices. The estimated production frontier increases from year to year, and moves along a common axis over the five years, revealing constant returns to scale. This hypothesis is not rejected by empirical testing. Then, the annual distributions of the individual productivity indexes and of the elements of their decomposition are compared. Vietnamese banks become more technically efficient over the period and the main source of differences in productivity patterns between these banks comes from mix and scale inefficiency. Finally, hierarchical clustering is used to identify similarities between banks with respect to their temporal patterns of productivity.
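For reference, the Färe-Primont index used here is of the multiplicatively complete form (stated in generic notation, roughly following O’Donnell; not a reproduction of the paper): aggregate output and input for observation (i,t) are Y_{it} = D_O(x_0, y_{it}, t_0) and X_{it} = D_I(x_{it}, y_0, t_0), computed with output and input distance functions evaluated at fixed reference vectors (x_0, y_0, t_0), and the TFP index comparing observations (h,s) and (i,t) is

\[ TFP_{hs,it} = \frac{TFP_{it}}{TFP_{hs}} = \frac{Y_{it}/X_{it}}{Y_{hs}/X_{hs}}, \]

which is transitive and requires only quantity data, and which can then be decomposed into the technical change, technical efficiency, and mix and scale efficiency components examined in the paper.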

ID: 120 Multiple Direction Vectors for Measuring Biased Technical Change Authors: Hideyuki Mizobuchi Contact: [email protected] Category: Index Numbers - Theory Abstract: Productivity growth is decomposed into technical change and technical efficiency change components. The

former component is constructed to measure the distance between the production frontiers of the two periods compared. The distance between two production frontiers is computed by the distance between two isoquants measured along a single direction vector. The isoquant, which is part of the production frontier conditional on a given output, represents the possible combinations of inputs that can produce the same level of output. When technical change is not Hicks neutral and is biased towards certain factor inputs, the distance between the two isoquants differs depending on the choice of a direction vector. Thus, when adopting only one direction vector for measuring distances, we fail to capture the global picture of technical change, resulting in a biased estimate. To prevent this problem, we measure the distance between two isoquants utilizing two direction vectors. The distances measured by two direction vectors enable us to define an alternative measure of technical change. Along with a measure of technical efficiency change, it leads to an alternative measure and decomposition of productivity growth. Finally, we empirically compare our decomposition of productivity growth with the alternative decompositions using cross-country panel data during 1950–2011.

ID: 121 Using Bayesian Network to Analyze Tourists’ Behaviors – A Case of Brand’s Health Museum Authors: Hsin-Hung Wu Contact: [email protected] Category: Other Topics Abstract: The tourism factory project initiated by the government in Taiwan encourages traditional manufacturing

companies with a unique industrial history and culture to transform into tourism factories. Companies can establish a bond between consumers and the brand, generate additional income from entrance tickets and on-site sales, and eventually add value through service innovation. However, tourists’ purchase behavior is the key factor for a successful tourism factory. Moreover, purchase behavior may depend on demographic variables. In this study, the case of Brand’s Health Museum is chosen to analyze tourists’ purchase behavior based on demographic variables by applying a Bayesian network. That is, purchase behavior can be expressed in terms of probabilities rather than deterministic expressions. The internal questionnaire developed by Brand’s Health Museum is used, where the demographic variables include gender, age group, type of visit, and living region. The purchase behavior variable captures whether tourists intend to purchase or have purchased items from the gift shop. The total sample size is 869: 549 visitors are female (63.2%), 728 are individual visitors (83.8%), most visitors are 20 years old and below (27.9%), 21-30 years old (28.0%), or 31-40 years old (30.3%), and the majority of visitors live in the central Taiwan area, which is close to Brand’s Health Museum. The software IBM SPSS Modeler 14.2 is used, and a tree-augmented naïve Bayes model is chosen for the Bayesian network. The percentages of the training and testing data sets are 70% and 30%, respectively, and the Bayesian network parameters are set to their default values. The Bayesian graphical model shows that age group is the most critical variable. Given the age group of 31-40 years old, purchase has the highest probability, 0.8733, while the purchase probability is the lowest at 0.6216, followed by the age group of 51 years old and above. The respective probabilities for the 21-30 and 41-50 age groups are 0.7482 and 0.7808. From a managerial viewpoint, the primary target of the marketing strategy, in terms of purchase probability, is the 31-40 age group, followed by the 41-50 and 21-30 age groups.
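As a minimal illustration of the probabilistic reading of such results (toy data, not the museum’s survey and not the SPSS Modeler workflow), the conditional purchase probabilities that a Bayesian network node encodes can be tabulated directly:

# Hedged sketch: conditional purchase probabilities by age group, the kind of
# quantity a Bayesian network node encodes (invented toy data, illustrative only).
import pandas as pd

df = pd.DataFrame({
    "age_group": ["<=20", "21-30", "31-40", "41-50", "51+", "31-40", "21-30", "51+"],
    "purchase":  [0,       1,       1,       1,       0,     1,       0,       1],
})

# P(purchase | age_group): row-normalised cross-tabulation
cpt = pd.crosstab(df["age_group"], df["purchase"], normalize="index")
print(cpt)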

ID: 122 Sequential metafrontier cost Malmquist productivity index: an application to commercial banks in Asian

countries Authors: Chia-Hung Sun, Tsu-Tan Fu and Mei-Ying Huang Contact: [email protected] , [email protected] , [email protected] Category: SFA - Applications to Banking Abstract: The paper examines the effect of ownership and governance on bank performance in China. To investigate

the potential impact of changes in governance due to the 2008 financial crisis on bank performance, we employ data on 71 Chinese commercial banks over the period 2006-2012. The performance indicators of a bank include profit efficiency, ROA, ROE, and the non-performing loan ratio. Commercial banks in China can be categorized into four types: state-owned, national shareholding, city and foreign, while the variables used as proxies for bank governance include board size, the percentage of independent directors, and the percentage of shares held by directors. We employ Stochastic Frontier Analysis to estimate the technical efficiency and profit efficiency of the sample banks. Our preliminary results show that state-owned banks perform best on the profit-related performance indicators (profit efficiency, ROA, ROE), followed by city, national shareholding, and foreign banks. Performance based on the non-performing loan ratio, however, provides a different ranking, indicating that foreign and city banks perform better than state-owned and national shareholding banks. We further employ regression analysis to examine the significance of bank governance for bank performance in China for the whole sample period and for sub-periods before and after the 2008 financial crisis. The results differ by performance indicator, but in general we find that bank governance has a relatively weak relationship with bank performance compared to the relationship between bank ownership and bank performance. However, these findings are still very preliminary and deserve further investigation.

ID: 124 Panel unit root tests for convergence in Japanese rice productivity Authors: Katsunobu Kondo (Asahikawa University), Jun Sasaki (Tokyo University of Agriculture) and Yasutaka Yamamoto (Hokkaido University) Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Agriculture & Fisheries Abstract: Kondo, Sasaki and Yamamoto (2010) show that the Japanese rice

productivity gap across Japan’s domestic regions narrowed and converged over the period 1957-1995. Did Japanese rice productivity also converge across its domestic regions after 1996? In order to answer this question, convergence in the total factor productivity (TFP) of Japanese rice across the regions is statistically tested for the period 1996-2006. In this study, panel unit root tests for convergence are performed using estimates of regional TFP for Japanese rice. Key words: productivity gap, convergence, panel unit root tests
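In stylized form (generic notation, not necessarily the authors’ exact test equation), convergence is examined by testing for a unit root in the log productivity gap, for example

\[ \Delta \ln\!\left(\frac{TFP_{it}}{\overline{TFP}_t}\right) = \alpha_i + \rho_i \ln\!\left(\frac{TFP_{i,t-1}}{\overline{TFP}_{t-1}}\right) + \sum_{k=1}^{p_i}\phi_{ik}\,\Delta \ln\!\left(\frac{TFP_{i,t-k}}{\overline{TFP}_{t-k}}\right) + \varepsilon_{it}, \]

where \overline{TFP}_t is the national (or benchmark) level; rejecting the panel unit-root null H_0: ρ_i = 0 for all regions in favour of ρ_i < 0 is evidence that regional TFP gaps are closing.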

ID: 125 Liberalization Reform and Technical Efficiency in the Indonesian Manufacturing Sector Authors: Bernadetta Dwi Suatmi (School of Economics, Curtin Business School, Curtin University) Contact: [email protected] Category: SFA - Applications to Macro Abstract: The impact of trade reform on technical efficiency remains a controversial issue in economic

development. A number of empirical studies have examined this impact, with mixed findings. This study attempts to provide some evidence to help reconcile the differences in the empirical evidence by examining the experience of Indonesian manufacturing industries. It investigates the technical efficiency of Indonesian manufacturing industries, particularly addressing the influence of liberalization. A stochastic frontier approach (SFA) is used to examine the technical efficiency of Indonesian manufacturing industries over the period 1980-2000. With firm-level panel data, individual estimations are made for each industrial group categorized by the two-digit International Standard Industrial Classification (ISIC). The results show that trade reform variables have a favourable impact on technical efficiency. Data used for this study are obtained from the Medium and Large Scale Industries data provided by the Indonesian Central Bureau of Statistics as well as other databases. Keywords: Indonesian manufacturing, stochastic frontier approach, liberalization, technical efficiency

ID: 126 Metafrontier Cost Malmquist Productivity Index: An Application to Commercial Banks in Asia Authors: Mei-ying Huang and Jia-Ching Juo Contact: [email protected] Category: DEA - Applications to Banking Abstract: This study applies the data envelopment analysis (DEA) approach and extends the group-specific cost

Malmquist (CM) productivity index of Maniadakis and Thanassoulis (2004) to define the meta cost Malmquist productivity index under the meta production technology. Parallel to the technology gap defined in the relationship between a group-specific adopted technology and its meta potential technology, a CM gap is defined as the ratio of two cost Malmquist productivity indexes that measures the convergence of the group-specific cost frontier to the meta cost frontier. A further decomposition of the CM gap is also developed. Panel data on Asian banks from China, Pakistan, Japan, Taiwan, Indonesia, India, Thailand and the Philippines covering the period 2007-2011 are used to implement the decompositions of the CM index and the CM gap. The cross-country comparison is conducted through productivity gaps and the sources of those gaps. Empirical results suggest different convergence paths of cost Malmquist productivity growth for different sample countries. For all the sample countries, the country-specific cost Malmquist productivity barely improves over the meta cost Malmquist productivity. For the fastest converging country, the Philippines, country-specific cost Malmquist productivity improves over the meta cost Malmquist productivity by only 0.90% (a ratio of 1.0090). On the other hand, Japanese banking diverges from the meta cost Malmquist productivity by 1.35% (a ratio of 0.9865). The convergence (divergence) of country-specific production cost to (from) the meta production cost is almost nil.

ID: 127 The Industry-Level Productivity Comparison between Korea-Taiwan: evidence from Korea and Taiwan KLEMS

Database Authors: Tsu-Tan Fu, Hsing-chun Lin, Yih-ming Lin and Wei-sing Kong Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: Growth Accounting Abstract: This is for the special session on ASIA KLEMS chaired by Marcel Timmer. In this paper, we would like to

illustrate a comparison between patterns of growth in Korean and Taiwanese industries during the period 1981-2010. The development of key industries in Korea and Taiwan is similar, and competition among the firms in the two economies is intense. Korean firms are the main international competitors of Taiwanese firms in the world market, and they consider each other rivals. Therefore, the comparison is especially important in the formulation of growth policy for each country. We follow Jorgenson, Kuroda and Nishimizu (1987) to conduct Korea-Taiwan industry-level productivity comparisons from 1981 to 2010. We further conduct cross-period and cross-industry comparisons of the structure of industry output growth and of factor contributions in Taiwan and Korea. The data employed in this study are from the Korea and Taiwan KLEMS databases.
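For reference, the KLEMS-type growth accounting underlying such comparisons decomposes industry output growth as (standard notation, not a reproduction of the paper)

\[ \Delta \ln Y_{jt} = \bar{v}^{K}_{jt}\,\Delta \ln K_{jt} + \bar{v}^{L}_{jt}\,\Delta \ln L_{jt} + \bar{v}^{M}_{jt}\,\Delta \ln M_{jt} + \Delta \ln TFP_{jt}, \]

where the \bar{v} terms are two-period average cost shares of capital, labour and intermediate inputs for industry j; the bilateral Korea-Taiwan level comparison applies the same logic through translog index numbers across the two economies.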

ID: 129 The Contribution of Research and Innovation to Productivity and Economic Growth Authors: Kevin Fox and Amani Elnasri Contact: [email protected] Category: Growth Accounting Abstract: This paper examines the impact of investment in research and innovation on Australian market sector

productivity. While previous studies have largely focused on a narrow class of private sector intangible assets as a source of productivity gains, this paper shows that there is a broad range of other business sector intangible assets that can significantly affect productivity. Moreover, the paper pays special attention to the role played by public support for research and innovation in the economy. The empirical results suggest that there are significant spillovers to productivity from public sector R&D spending on research agencies and higher education. No evidence is found for productivity spillovers from indirect public support for the business enterprise sector, civil sector or defence R&D. These findings could have implications for government innovation policy as they provide insights into possible productivity gains from government funding reallocations.

ID: 131 Regulatory Reforms and Bank Efficiency in Indonesia: A Two-Stage Analysis Authors: Felisitas Defung, Ruhul Salim and Harry Bloch Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Banking Abstract: In the more than a decade since the severe economic crisis of 1997, Indonesia has undergone major regulatory changes in

the banking industry. The restructuring program continues to the present, to strengthen and improve the performance of the banking system. This paper examines the impact of regulatory changes on the relative technical efficiency (TE) of the Indonesian banking industry, employing data envelopment analysis (DEA) in the first stage and a censored Tobit regression model in the second stage. To overcome the lack of statistical inference in DEA, this paper employs the bootstrapping approach developed by Simar and Wilson (1998), which provides bias-corrected estimates and confidence intervals for the DEA efficiency scores. The analysis covers 101 Indonesian commercial banks using 19 years of data (1993-2011). The findings show that although the industry is on average technically inefficient over the period of analysis, efficiency improves over time. State-owned and foreign banks are found to be more efficient than other groups of banks. In the second stage, the impact of regulatory reforms is positive and significant at the industry level.

ID: 132 Matching the education system to the needs of the economy: evidence from Burkina Faso Authors: Élisé Wendlassida Miningou and Valérie Vierstraete Contact: [email protected] Category: DEA - Applications to Health & Education Abstract: Having a skilled labor force is an important input for economic development. Like almost all developing countries,

Burkina Faso has launched several development projects aiming to improve the population’s human capital. While many projects focus on primary education, many others concentrate on secondary and tertiary education. Education data show that these efforts have had impacts on the whole education sector. However, comparing the profile of skilled labor the education system produces with the structure of the economy, one could question the match between the education system and the needs of the economy in terms of skilled labor. For instance, agriculture provides more than 35% of the GDP, while only 1% of the education programs concerns agriculture. In order to build proper education policies, it could be important to know which graduate specialties the economic sectors need in order to maximize the overall productivity of the economy. In the current paper, we investigate the issue of the match between the education system and the needs of the whole economy in terms of skilled labor. The purpose is to evaluate the returns to scale associated with the labor force involved in different branches of economic activity in Burkina Faso. Measuring returns to scale provides information about the optimal use of specialized labor in these economic branches. A semi-parametric method proposed by Boussemart et al. (2009) is thus applied to adequately measure increasing returns to scale by relaxing the convexity assumption. Preliminary results show that 75% of the DMUs analyzed exhibit increasing returns to scale while only 10% have an optimal size. There is a strong heterogeneity between DMUs in terms of the intensity of these increasing returns to scale. Overall, our results show that improving the quantity and the quality of the labor force specialized in some specific fields could lead to important productivity gains.

ID: 133 Scope economies in Australian distance education Authors: Liang-Cheng Zhang, Jia-Jia Syu and Andrew Worthington Contact: [email protected] , [email protected] , [email protected] Category: ECR Day Abstract: The main purposes of the present study are to estimate scale and scope economies and efficiency scores for

Australian universities. The main distinction between our study and previous research is that we consider another instructional form of higher education. Since the 1980s, distance education (or external studies) in Australia has not only played an important role as an alternative mode of delivery but also become a competitive rival to on-campus (internal studies) instruction. Distance education could have greater potential economies of scale than conventional education because most costs in conventional education are variable and driven by the student-staff ratio. Another distinction is that we consider the unobserved heterogeneity among universities. A Random Parameters Model (RPM) is used to account for unobserved heterogeneity in the parameters. Unlike previous research using simulated maximum likelihood to estimate random parameters models, our model is estimated in a Bayesian framework, because the Bayesian approach naturally regards all parameters as random variables characterized by a prior distribution. Each parameter vector is defined hierarchically as a random vector with its own distribution, allowing the parameters of the equation to vary across alliance groups and time periods. We employed a sample of 38 Australian universities over a three-year period (2010-2012), and adopted a quadratic cost function with 7 outputs (undergraduate internal completions, postgraduate internal completions by coursework, postgraduate internal completions by research, undergraduate external completions, postgraduate external completions by coursework, postgraduate external completions by research, and the amount of national competitive and industry grants). The price variables are dropped because staff salaries have been determined nationally in Australia since 1988. All the monetary variables (cost and the amount of national competitive and industry grants) are converted to real values (2010 = 100) using the consumer price index. The results indicate that differences across alliance groups and time periods account for the heterogeneity. Weak cost complementarities were found between distance education and research grants. Except for the non-alliance institutions, most alliance members performed quite efficiently based on our efficiency scores. Relevant policy suggestions are also provided in the present study.
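For reference (generic notation; the study’s multi-output quadratic specification is richer), with a two-output cost function C(y_1, y_2) global economies of scope are measured as

\[ SC = \frac{C(y_1, 0) + C(0, y_2) - C(y_1, y_2)}{C(y_1, y_2)}, \]

which is positive when joint production is cheaper than stand-alone production, while pairwise cost complementarity requires \partial^2 C / \partial y_1 \partial y_2 < 0; in a quadratic cost function this reduces to the sign of the corresponding interaction coefficient, which is how the weak cost complementarities between distance education and research grants reported above would be read.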

ID: 135 Tailor-made DEA models: an analysis of European farming without price aggregation Authors: Victor Podinovski Contact: [email protected] Category: DEA - Applications to Agriculture & Fisheries Abstract: We report on the application of DEA to the analysis of efficiency and policy issues concerning European farms.

Unlike in many studies on this subject, we keep a large number of farm products disaggregated. This has an advantage of not imposing a particular price structure on the outputs. To cope with the high dimensionality of the resulting DEA models, we introduce expert judgements about the production trade-offs between different crops into the DEA models. We discuss the computation of efficiency and returns-to-scale characterisation in the resulting models, and various policy implications of our findings.

ID: 139 APEAR: A package for productivity and efficiency analysis in R Authors: Atakelty Hailu Contact: [email protected] Category: Other Topics Abstract: This paper presents a package for productivity and efficiency analysis in R (apear), with and without undesirable

outputs. The package allows one to use radial or directional efficiency measures and to choose between parametric and non-parametric frontier specifications. Utilities for estimating output shadow prices (abatement costs) are also provided as are example data sets that have been used in previously published journal articles.

ID: 142 A Cost Frontier Approach to Analysing Productivity Change in Ethiopian Manufacturing Authors: Christopher. J. O’Donnell and Addisu A. Lashitew Contact: [email protected] Category: SFA - Other Applications Abstract: We show how a cost function can be used to construct an input price index that satisfies important axioms from

index number theory, including the identity and transitivity axioms. We use this index to form a total factor productivity index that can be exhaustively decomposed into measures of technical change (capturing movements in the production frontier) and efficiency change (capturing movements towards, away from, or around the frontier). We illustrate the methodology using micro-level data on Ethiopian food and beverages manufacturing firms. We use econometric methods to estimate the parameters of a variable returns to scale cost frontier. We then report estimates of productivity change, technical change and different types of efficiency change.

ID: 144 The Structural Causes of Japan's Two Lost Decades Authors: Kenta Ikeuchi, Kyoji Fukao, YoungGak Kim and HyeogUg Kwon, Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: Index Numbers - Applications Abstract: This is for the special session on ASIA KLEMS chaired by Marcel Timmer. Although by the early 2000s, Japan had

largely overcome its non-performing loan problems, economic growth hardly accelerated, resulting in what are now “two lost decades.” This paper examines the underlying reasons from a long-term and structural perspective using a KLEMS-type database and micro-level data. Major issues examined include the chronic lack of domestic demand since the mid-1970s, caused by a long-run decline in capital formation through a slowdown in the growth of the working-age population and resulting in a current account surplus and yen appreciation, and supply-side issues such as stagnant TFP growth caused by Japan's low economic metabolism. A key finding is that whereas large firms since the mid-1990s have achieved greater increases in TFP than in the 1980s through R&D and internationalization, the TFP of small firms has stagnated.

ID: 145 An Application of Kourosh and Arash Model (KAM) in DEA to Measure the Efficiency of Iranian Airports Authors: Dariush Khezrimotlagh Contact: [email protected] Category: DEA - Other Applications Abstract: This paper reports the advantages of the Kourosh and Arash Model (KAM) in evaluating the performance of

Decision Making Units (DMUs). KAM is a robust model in Data Envelopment Analysis (DEA) for measuring the relative efficiency of homogeneous DMUs. It simultaneously ranks, classifies and benchmarks both technically efficient and inefficient DMUs, including integer, real, controllable or non-controllable factors. KAM readily detects outliers and measures the impact of each factor on the efficiency score. It is a flexible model that yields the same results as cost, revenue and profit efficiency models when cost information and unit prices are available. There is no difficulty in ranking DMUs with KAM even when the number of DMUs is less than the number of factors. These assessments are illustrated by a numerical example of 50 Iranian airports with 20 factors (12 inputs and 8 outputs). The computations are performed with Microsoft Excel Solver, since the model reduces to a simple linear program.

ID: 150 A Revenue System Model with Technical and Allocative Inefficiency Authors: Subal C. Kumbhakar, Frank Asche and Ragnar Tveteras Contact: [email protected] , [email protected] , [email protected] Category: SFA - Applications to Agriculture & Fisheries Abstract: In this paper we start from a multi-output transformation function in which the error terms appear either

additively or multiplicatively with the outputs. We consider a revenue maximizing framework in which the revenue function inherits the error terms from the transformation function. First, we adapt McElroy’s additive general error model (AGEM) to the transformation function and derive the revenue function. The error terms in the output supply functions, derived from the revenue function, inherit their stochasticity from the error terms of the transformation function. The second approach uses a multiplicative general error model (MGEM) in the spirit of Kumbhakar and Tsionas (2011). It is generalized to accommodate multiple outputs. The resulting revenue and revenue share functions inherit all the errors from the transformation function. Similar to McElroy, who assumed cost minimizing behavior, we find that the output supply system is easier to estimate for the AGEM formulation. On the other hand, it is easier to estimate the revenue share system under the MGEM. The error terms under both specifications are shown to be related to technical and allocative inefficiency. More specifically, we show that the error structure in the revenue shares is directly related to allocative inefficiency, while the error term in the revenue function has a somewhat complicated form and is related to pure noise as well as technical and allocative inefficiency. We use vessel-level data for the Norwegian whitefish fisheries for the period 1995 to 2007. For each vessel we observe the key vessel characteristics such as length, tonnage, crew size and days at sea. The vessels span a range from small coastal vessels at less than 10 meters to large trawlers, and the fleet uses a variety of gears such as nets, long-lines, Danish seines and trawls. Thus, the inputs are capital (the vessel), labor, and days at sea. The outputs are cod, saithe, haddock and other species (mainly bycatch, which we treat as a single group). Since the vessels are operated for 30 or more years, we treat them as exogenous; so is labor, since it is basically determined by vessel size.

ID: 153 Corruption: Impediment or lubricant of economic growth? Authors: Gary D. Ferrier Contact: [email protected] Category: DEA - Applications to Macro Abstract: Conventional wisdom in economics has been that the relationship between institutional quality and economic

growth is monotonic. Recent theoretical and empirical work has called this into question, suggesting that the relationship is more complex. For example, corruption may have a negative effect on growth because of rent-seeking behavior or the theft of public goods; on the other hand, corruption may serve the positive role of circumventing weak institutions. In this paper, a DEA-congestion model is applied to a sample of country-level data to examine corruption's impact on economic growth. “Corruption” (variously measured to examine robustness) is included as one of the inputs in the analysis. An analysis of whether corruption is a “congesting” input in different countries can be used to provide insight into whether the appropriate policy is a crack-down on corruption or an overhaul of institutions.

ID: 154 Treatment Effect Stochastic Frontier Models with Endogenous Selection Authors: Hung-Jen Wang, Yi-Ting Chen and Yu-Chin Hsu Contact: [email protected] , [email protected] Category: SFA - Theory Abstract: In many developing countries, government policies are instituted for the purpose of increasing productivity at the
industry as well as the firm level. Examples abound, ranging from policy deregulation and R&D subsidies aimed at the aggregate level to training programs and services targeted specifically at small and medium enterprises. These policies affect productivity through different channels. Some are clearly designed to advance production technology, some are tailored to enhance production efficiency given the technology, while others may have both ends in mind. Given that such policies are widely adopted by governments in developing countries, it is important for policy makers to have a method of evaluating policy effectiveness. Did the policy increase productivity at all? In particular, did the policy increase productivity through the technology or the efficiency channel? Did it work in the way it was designed to? Although the econometric literature has proposed methods to evaluate program effects on the final outcome, to our knowledge none of the existing methods is able to discern the channels through which the program affects the outcome. To fill the gap, we provide in this paper a treatment-effect model for stochastic frontier analysis. Since program participation is often voluntary, the endogeneity of treatment is considered in the model. The identification strategy follows from Abadie (2003), Frolich (2007, JoE), Hong and Nekipelov (2010) and Donald, Hsu and Lieli (2011). An empirical example of endogenous treatment effect evaluation is provided using a dataset on Indian villages.

ID: 156 Could investors in equity funds be risk-lovers? How to detect unexpected investors’ preferences with Data
Envelopment Analysis Authors: Albane Christine TARNAUD and LELEU Hervé Contact: [email protected] Category: ECR Day Abstract: Since Mullen & Strong proposed Data Envelopment Analysis in 1998 as a ‘potentially useful’ tool for performance
evaluation of financial assets, related work in that field using DEA has implicitly assumed risk aversion on the part of decision makers. Despite the conclusions of a substantial part of the behavioral finance literature that has opened new prospects on risk preferences, this basic assumption has not been questioned by any of the increasingly numerous studies that use DEA for performance measurement of financial assets. Yet the possibility of taking the preferences of risk lovers into consideration could be seen as an additional benefit that DEA provides over traditional performance ratios in finance, which fail to integrate these developments; still, risk has always been treated as an undesirable output, no matter which measure was used to characterize it. Therefore, as potential risk-seeking behaviors have not been addressed in the literature, we propose to investigate decision-makers' preferences on financial markets without restricting the analysis to risk-averse investors only. Opening the analysis to potentially uncommon preferences requires some adjustments; we consider, for instance, that risk is an output that cannot be freely disposed of but can only be reduced at the cost of hedging. The set of axioms we propose defines our production technology in such a way that preferences of risk lovers can be determined. In order to determine whether or not risk-seeking behaviors can be found among investors of funds, we compare the efficiency scores of each fund under various direction vectors that can be interpreted in terms of preferences of the fund's investors (we assume that their preferences perfectly match the funds' characteristics in terms of risk and return). Our data set consists of the past distribution of returns on open-end equity funds invested in by either institutional or non-institutional investors. We chose open-end funds so that the market forces of supply and demand do not affect the funds' Net Asset Value. We will also use various measures of risk in order to obtain a more precise description of the investors' preferences revealed by the funds' characteristics. Ultimately, the presence of risk-seeking investors can be confirmed or refuted on these fund markets and for each category of investors, depending on the results we find for our sample of funds.
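As a hedged sketch (our notation, not necessarily the authors' exact formulation), the kind of directional distance function involved can be written as

    \vec{D}(x, r, y; g_r, g_y) = \sup\{ \beta \ge 0 : (x,\ r + \beta g_r,\ y + \beta g_y) \in T \},

where x is the capital invested, r the risk measure, y the return and T the assumed production technology; the conventional risk-averse projection uses g_r < 0 and g_y > 0, while allowing directions with g_r > 0 is what opens the analysis to risk-seeking preferences.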


ID: 158 CORRUPTION, ACCOUNTABILITY AND EFFICIENCY. AN APPLICATION TO MUNICIPAL SOLID WASTE SERVICES Authors: Davide Vannoni, Graziano Abrate, Fabrizio Erbetta and Giovanni Fraquelli Contact: [email protected] Category: SFA - Applications to Public Sector Abstract: This paper explores the link between accountability, corruption and efficiency in the context of a career concern model where politically connected local monopolies are in charge of the provision of a local public service. We find that both corruption and a low degree of accountability induce managers to reduce effort levels, thereby driving down efficiency. Our predictions are tested using data on solid waste management services provided by a large sample of Italian municipalities. The results of the estimation of a stochastic cost frontier model provide robust evidence that high corruption levels and low degrees of accountability substantially increase cost inefficiency. Finally, we show that the negative impact of corruption is weaker for municipalities ruled by left-wing parties, while the positive impact of accountability is stronger if the refuse collection service is managed by limited liability companies.

ID: 161 Comparison of efficiency and production technology among the banking systems of Vietnam, China and India: a
stochastic cost and revenue meta-frontier approach Authors: Thanh Nguyen Contact: [email protected] Category: SFA - Applications to Banking Abstract: Given some similarities (central bank dependence, state dominance, regulatory restrictions and gradual reforms
towards liberalisation) and differences (supervisory authority and reform speed), this study estimates and compares cost and revenue efficiency and production technology among the banking systems of Vietnam, China and India over 1995-2011. Stochastic frontier models estimated for each banking system separately and for all systems combined show that banks from Vietnam, China and India operate under different technologies, implying that the efficiency comparison must be based on a meta-frontier which envelops all three country frontiers. Under the current technology, the long-run average cost of each banking system almost attains its minimum while the long-run average revenue almost attains its maximum. Results derived from the stochastic meta-frontier show that Chinese banks adopt the most advanced technology in providing banking products to their customers, followed by Indian banks and then Vietnamese banks. Indian banks are as cost-efficient as Chinese banks but more cost-efficient than Vietnamese banks, while they are as revenue-efficient as Vietnamese banks but less revenue-efficient than Chinese banks. The major source of inefficiency is inferior production technology rather than managerial inefficiency. Among bank types, Indian state-owned banks and Chinese city commercial banks are the most cost-efficient while Chinese state-owned banks are the most revenue-efficient. These findings indicate that Vietnamese banks could improve their efficiency by adopting the prevailing banking practices and technology in China and India.

ID: 163 Transitive estimates and an exhaustive decomposition of provincial productivity change in China Authors: James Laurenceson and Chris O’Donnell Contact: [email protected] Category: SFA - Applications to Macro Abstract: Improvements in productivity and efficiency lie at the heart of achieving sustainable economic growth in China.
This paper uses a geometric Young (GY) total factor productivity (TFP) index to measure and decompose productivity change in China’s 31 provinces over the period, 1978-2010. The GY TFP index is attractive because it can be exhaustively decomposed into three components of productivity change that have a sensible and straightforward interpretation in economic theory. It is also transitive meaning that productivity estimates can be reliably compared both across provinces and over time. The components of TFP change are estimated by applying least squares techniques to a dataset that consists of three outputs and six inputs, including capital stock estimates that feature province-specific investment price deflators and rates of depreciation that vary by province and sector. We find evidence of modest TFP growth across provinces with the rate being higher in coastal provinces than in inland provinces. TFP growth is found to mainly reflect changes in technology and the production environment rather than changes in efficiency, albeit there is some evidence of heterogeneity in the components of TFP growth at the provincial level. Policy implications are discussed.
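For readers unfamiliar with the index, a common way of writing a geometric Young TFP index (the authors' exact weighting scheme may differ) is

    TFP_{hs,it} = \frac{\prod_{n} (q_{n,it}/q_{n,hs})^{\bar{r}_n}}{\prod_{k} (x_{k,it}/x_{k,hs})^{\bar{s}_k}},

comparing province i in period t with province h in period s, where q and x are output and input quantities and \bar{r}_n and \bar{s}_k are revenue and cost shares held fixed (e.g., at sample averages) across all comparisons; it is this fixed weighting that delivers transitivity.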


ID: 169 China's Post-Reform Growth, Structural Change and Productivity Performance Authors: Harry Wu Contact: [email protected] Category: Index Numbers - Applications Abstract: (Invited by Marcel Timmer to present at the Session on Asia KLEMS.) Using a newly constructed KLEMS-style data set for China, this paper provides a complete, quantitative view of China's post-reform growth, structural change and productivity performance at the sector level of the economy. The data are constructed following the concepts of national accounts within a coherent input-output framework for the period 1980-2010 (the detailed data work is documented separately in three background papers accompanying the data). The official output accounts are not challenged, but serious inconsistencies and structural breaks have been carefully adjusted for (the key adjustments are explained in this paper). In addition, alternative producer price indices are constructed at the sector level for double deflation. The productivity analysis is conducted in the standard Jorgenson-Griliches framework in which the service flows of primary inputs are derived from their user costs and the weights of all inputs are coherently linked to the national income accounts. The productivity results are assessed and discussed against the background of China's macroeconomic dynamics and shifts of policy regime at different stages of the reform period 1980-2010.
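The sector-level decomposition underlying the Jorgenson-Griliches framework can be sketched (in simplified notation, with two-period-average nominal cost shares \bar{v} summing to one) as

    \Delta \ln TFP_j = \Delta \ln Y_j - \bar{v}_{K,j}\Delta \ln K_j - \bar{v}_{L,j}\Delta \ln L_j - \bar{v}_{E,j}\Delta \ln E_j - \bar{v}_{M,j}\Delta \ln M_j - \bar{v}_{S,j}\Delta \ln S_j,

where Y_j is the gross output of industry j and K, L, E, M and S denote capital, labour, energy, material and service inputs measured as service flows.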

ID: 170 THE “PRODUCTIVITY RACE” IN MANUFACTURING BETWEEN CHINA AND INDIA IN 1980-2010: A PRODUCTION-PPP
APPROACH Authors: Harry Wu, Boon L. Lee and D.S. Prasada Rao Contact: [email protected] , [email protected] , [email protected] Category: Growth Accounting Abstract: In this study, we use the production-side purchasing power parity (PPP) approach to conduct industry-level
output and productivity comparisons between Indian and Chinese manufacturing. This production-side PPP method addresses the mismeasurement problems that arise when the same industries are compared across countries using market exchange rates, which ignore the cost of nontradables within countries and lack industry-of-origin implications. For our PPP-based comparison, we first construct a set of India/China industry-specific PPPs for 1995 using the available census data. We then construct industry-level employment, capital stock and value added data for both countries in the same framework, designed for the PPP conversions of output and capital stock and for comparative growth accounting analysis. Our empirical exercises focus on the comparative levels of real output, the capital-labour ratio and total factor productivity in individual industries in the two countries. This allows us to investigate the time profiles of these important indicators and, by using China as the base country, helps identify those industries in India that have achieved better productivity performance than their Chinese counterparts as well as those that have been overtaken by them. The results are discussed against the background of major reforms in the two economies.

ID: 174 Will Weighted-student Funding Enhance Equity in Texas? A Simulation using DEA Authors: Kathy Hayes, Shawna Grosskopf, Lori Taylor and William L. Weber Contact: [email protected] , [email protected] , [email protected] , [email protected] Category: DEA - Applications to Health & Education Abstract: Weighted student funding has become a popular strategy for enhancing within-district equity. With weighted
student funding, resources are allocated to the school level according to the demographic characteristics of the students. Students with greater perceived needs are allocated higher amounts. Here we explore the equity implications of weighted student funding for school districts in the Dallas, Houston, and San Antonio metropolitan areas. We use data envelopment analysis to model the educational production function, and then explore how a shift to weighted student funding (using the student weights embedded in the Texas School Finance Formula) would alter the observed allocation of inputs and potential outputs. We find that if school districts allocated their resources efficiently, then they would not allocate their resources according to the funding model weights. In our model school outputs are measured as value-added reading and math scores on standard achievement tests. The use of value-added test scores allows us to control for various home inputs that students bring with them to school and are treated as fixed inputs. Schools do have discretion over how their budgets are allocated, whether it be under weighted student funding or the status quo. We model the education production technology of schools using the direct and indirect production possibility sets. The direct production possibility set holds inputs constant and efficiency is measured as the potential gain in outputs given inputs. The indirect production possibility set holds the budget used to hire inputs constant, but allows schools to reallocate resources such as teachers, administrators, and non-personnel inputs, given input prices. Comparing the potential gain in outputs relative to the direct production possibility set with the potential gain in outputs relative to the indirect production possibility set allows us to measure pure technical inefficiency and input allocative inefficiency.
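In the spirit of the indirect production framework of Färe and Grosskopf (our notation, offered only as a sketch), the two possibility sets can be written as

    P(x) = \{ y : x \text{ can produce } y \}, \qquad IP(w, c) = \bigcup_{\{x\,:\,w'x \le c\}} P(x),

so the direct set holds the input vector x fixed, while the indirect set allows any input vector whose cost at prices w stays within the budget c; comparing maximal outputs over P(x) and IP(w, c) is what separates pure technical inefficiency from input allocative inefficiency.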


ID: 175 THE TECHNICAL EFFICIENCY OF WINE GRAPE GROWERS IN THE MURRAY–DARLING BASIN IN AUSTRALIA Authors: Tim Coelli and Orion Sanders, Contact: [email protected] , [email protected] Category: SFA - Applications to Agriculture & Fisheries Abstract: Australian wine grape growers have faced difficult market conditions in recent years, with reductions in grape
prices and increases in irrigation water prices having a significant effect on farm profitability. This has placed significant pressure on these farmers to be as efficient as possible. In this exploratory study, we use a subset of the Australian Bureau of Agricultural and Resource Economics and Sciences (ABARES) irrigation survey, consisting of four years of unbalanced panel data on 135 wine grape specialists (214 observations in total) from the Murray and Murrumbidgee River basins in Australia (from 2006-07 to 2009-10), to estimate production frontiers and estimate the efficiency levels of the individual farmers. Maximum likelihood methods are used to estimate translog stochastic production frontiers. Mean technical efficiency is estimated to be 79 per cent, with some farms achieving well below this level. This suggests some farms may face significant pressure if grape prices do not improve. However, it also suggests that there may be significant potential for these farmers to adjust to these pressures. A mean scale economies estimate of 1.07 is obtained, suggesting farm amalgamations may occur. Technical change is also estimated at 2.7 per cent per year. However, these estimates are based on a relatively small timeframe and may be biased by drought during the early survey period. Furthermore, we find that our shadow price estimates for irrigation water are above average market prices during the surveyed years. However, the price of water varied widely during the sample years and the average market price might not accurately reflect the price faced by irrigators during critical growing periods. Keywords: wine grape production, irrigation, scale economies, shadow prices
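A generic version of the translog stochastic production frontier estimated in studies of this kind (the exact regressors and assumed inefficiency distribution may differ from the paper's) is

    \ln y_{it} = \beta_0 + \sum_j \beta_j \ln x_{jit} + \tfrac{1}{2} \sum_j \sum_k \beta_{jk} \ln x_{jit} \ln x_{kit} + \delta_1 t + \tfrac{1}{2}\delta_2 t^2 + \sum_j \phi_j\, t \ln x_{jit} + v_{it} - u_{it},

with v_{it} a symmetric noise term, u_{it} \ge 0 one-sided inefficiency, scale elasticity given by \sum_j \partial \ln y / \partial \ln x_j and technical change by \partial \ln y / \partial t.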

ID: 179 Technical Efficiency of Smallholder Horticultural Farmers in Ghana Authors: Freda Asem Contact: [email protected] Category: SFA - Applications to Agriculture & Fisheries Abstract: Growth in the international trade of high-value commodities has resulted in significant growth in many
developing countries. Trade in high-value agricultural products is increasingly displacing exports of traditional commodities. While market access remains important, Ghana's major handicap is its inability to sustain export growth on the open market. The causes could be attributed to inefficiency, lack of competitiveness and supply-side constraints. Over the nearly two decades since Ghana made a bold entry into the horticultural export market, it has made considerable progress. This notwithstanding, the positioning of Ghanaian products in international markets is still weak. Even though Ghanaian horticultural exports have grown dramatically over the last ten years, they have yet to fulfil their potential. Given Ghana's comparative advantage over its competitors, why is it still not able to achieve its full potential in the European market? Efficiency in production is increasingly becoming a necessary condition for entering and competing on global markets. To address these issues, this study examined the factors that affect the efficiency of smallholder horticultural farmers in Ghana. Using survey data on about 6,000 farm households in 23 districts in Ghana, the study assessed the technical efficiency of smallholder horticultural farmers using the translog functional form of the stochastic production frontier. Results showed that fertilizer, seed and labour costs had positive and significant effects on the yields of pineapple, mango and chilli pepper. Fertilizer and herbicide were found to be risk-increasing inputs while tractor and labour were risk-decreasing inputs. Farm size was found to have a positive and significant effect on the technical inefficiency of mango, pineapple and chilli pepper farmers. Older farmers were found to be less technically inefficient than younger farmers. Male farmers were also found to be more technically efficient than female farmers. Comparing the production of the farmers to their respective frontiers, the average technical efficiency for chilli pepper, pineapple and mango farmers was 66.7%, 80% and 82% respectively. Recommendations based on the findings were that fertilizer and seed should be made accessible and affordable to farmers. Also, formal education should be encouraged, especially in rural areas, and interventions to improve farmers' productivity should be gender responsive.


ID: 186 MANAGERIAL PERFORMANCE AND TECHNICAL EFFICIENCY: A META-ANALYSIS OF THE FRONTIER LITERATURE Authors: Boris Bravo-Ureta, Roberto Jara-Rojas, Víctor Moreira and Lisandro Roco Contact: [email protected] Category: SFA - Applications to Agriculture & Fisheries Abstract: Farrell's (1957) path-breaking paper has inspired a robust literature focusing on the measurement and analysis of technical efficiency (TE). The earliest work relied on deterministic non-parametric approaches, then on deterministic parametric frontiers, followed by parametric stochastic frontiers and Data Envelopment Analysis. All of these models have generated a large volume of TE measures, some of which have been synthesized in a few meta-analysis studies. It is fair to say that no matter how TE is measured, its definition as the gap between observed output and potential maximum output given the technology and a set of inputs is widely accepted. What is less clear is the meaning of TE and particularly the variables that should be included in models that seek to explain the variability in TE. This line of investigation originated with the so-called two-step analysis but then gave way to one-step estimates embedded in stochastic frontier models that have become widely used following the work of Battese and Coelli (1992; 1995). A few papers dating back to the 1960s, which have been forgotten and/or largely ignored, established a clear link between TE and managerial performance (e.g., Mundlak 1961). Subsequently, Martin and Page (2003) presented a clear link between TE and managerial effort. More recently, Triebs and Kumbhakar (2013) contend that management is an important factor of production but one that is difficult to observe and thus to measure directly. Instead, it is commonly treated as an unobservable and proxied by TE. The objective of our work is to take stock of how the applied literature has attempted to explain variability in TE. To provide a conceptual framework we first provide a synthesis of the work that connects TE and managerial performance. Then, building on earlier work (Bravo-Ureta et al. 2007), we undertake a meta-analysis of the literature that has used one-step and two-step analyses to explain TE. After a thorough search of several databases (Web of Science; Agricola; Agris International; Ingenta; Science Direct; Social Science Citation Index, among others), a total of 150 suitable published papers have been identified. Following Knowler and Bradshaw (2007) and Prokopy et al. (2008), we use a "vote count method" in a structured review exercise to analyze the variables affecting TE. Thus far the results are mixed; however, the work continues both in terms of searching for more studies and performing additional analyses.

ID: 187 Evaluating productivity change using an unbalanced panel data approach: case of electric generation companies in
U.S. and Korea Authors: Yoonhwan Oh, Dong-hyun Oh and Jeong-dong Lee Contact: [email protected] , [email protected] , [email protected] Category: DEA - Applications to Energy Abstract: This paper presents an alternative framework for measuring productivity growth which can be applied to unbalanced
panel data and also allows for the nature of technical progress. We adopt the directional distance function defined on the sequential production possibility set to calculate global productivity indices. The proposed framework overcomes some shortcomings of conventional Malmquist productivity indices, namely the difficulty of handling unbalanced data sets and the spurious interpretation of technological deterioration. The framework is applied to unbalanced panel data on electricity generation companies in the U.S. and Korea observed from 2001 to 2010. We highlight two main empirical findings: i) technical change is the main contributor to productivity growth, and ii) both countries improved their productivity growth and technical change after the regulatory reforms.
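A hedged sketch of the type of global index involved (written here with radial distance functions for brevity; the paper uses a directional distance function defined on the sequential technology) is

    M^G_i(t, t+1) = \frac{D^G(x_i^{t+1}, y_i^{t+1})}{D^G(x_i^{t}, y_i^{t})} = \frac{D^{t+1}(x_i^{t+1}, y_i^{t+1})}{D^{t}(x_i^{t}, y_i^{t})} \times \left[ \frac{D^G(x_i^{t+1}, y_i^{t+1})/D^{t+1}(x_i^{t+1}, y_i^{t+1})}{D^G(x_i^{t}, y_i^{t})/D^{t}(x_i^{t}, y_i^{t})} \right],

where D^t is measured against the period-t (here, sequential) technology and D^G against a single global benchmark constructed from all periods; the first ratio captures efficiency change, the bracketed term technical change, and the global benchmark is what accommodates unbalanced panels and rules out spurious technical regress.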

ID: 189 Productivity Change in the Mid-Atlantic Surfclam and Ocean Quahog Individual Transferable Quota (ITQ) Fishery:
1980-2011 Authors: John Walden, Rolf Färe and Shawna Grosskopf,

Contact: [email protected] , [email protected] , [email protected]

Category: Environmental Aspects in Productivity & Efficiency

Abstract: A biomass adjusted productivity index is developed to measure productivity and productivity change for a commercial fishing fleet. The index was then used to examine productivity trends for the Mid-Atlantic surfclam and ocean quahog fleet before and after transition to an individual transferable quota (ITQ) management system. Changes in productivity are related to harvesting strategy, and the influence of entering and exiting vessels. Results show that after an initial productivity gain, productivity declined over the rest of the time period. Entering vessels had little impact on aggregate productivity, but on an individual basis, they eventually were equal in productivity to survivor vessels. Finally, there was a trend toward specialization and away from diversified production once the ITQ program began. Vessels that specialized in either surfclams or ocean quahogs were equally productive near the end of the study period.

ID: 192 Non-parametric Approach to Economic Growth and Convergence

Authors: Kelly Trihn and Valentin Zelenyuk

Contact: [email protected] , [email protected]

Category: SFA - Applications to Macro

Abstract: In this study we considered stochastic data envelopment analysis (DEA) and the local maximum likelihood method to investigate the main drivers of economic growth, convergence and divergence in the cross-country data used in Kumar and Russell (2002). Kumar and Russell (2002) conducted their study for a sample of 57 countries over the period 1965-1990 via the DEA approach. The method does not impose restrictions on the production function, on market structure or on the stochastic process governing the technical inefficiencies. However, DEA does not take account of measurement error, which results in inconsistent and biased DEA estimators. This can be somewhat circumvented by employing the stochastic DEA approach proposed by Simar and Zelenyuk (2011). In addition, all observations are presumed to face the same production frontier in the DEA framework, which might not be convincing in some cases. We circumvented this by breaking the sample into two groups (Organisation for Economic Cooperation and Development (OECD) and non-OECD), and then adapted the local maximum likelihood method proposed by Park et al. (2013). The main goal of this paper is to investigate whether the results found in Kumar and Russell (2002) change significantly once various econometric approaches circumventing the aforementioned limitations of DEA are adopted.

ID: 193 A Conditional Nonparametric Analysis for Assessing the Impact of Stock Volatility on the Efficiency of Listed Commercial Banks

Authors: Luiza Bădin, Anamaria Aldea and Carmen Lipară

Contact: [email protected] , [email protected] , [email protected]

Category: DEA - Applications to Banking

Abstract: The latest financial crisis revealed significant imbalances within world's financial system. The soundness of all types of banks became one of the most debated topics worldwide among regulators, investors, public authorities and other interested third parties, given their importance to the economic development of any country. Several banks from different countries have been saved from bankruptcy by their Government authorities in different periods of time, whilst some others not, because of limited resources and/or lack of viability of such operations. In this paper we analyze the impact of stock volatility on the efficiency of listed commercial banks, through a nonparametric conditional methodology recently proposed by Bădin et al. (2012, 2014). The approach is fully nonparametric, does not rely on the restrictive separability condition between the attainable production set and the space of external factors and provides an insightful analysis on the components of the impact of external factors: impact on the shape of the frontier and impact on the distribution of inefficiency. Our study is focused on listed commercial banks from FactSet database, as their activity represents the primary function of banks, namely lending and borrowing financial resources. The results provide a perspective on how listed commercial banks performed after the first year of crisis and what were the effects of the initial measures taken by regulatory bodies and governments. Better performance can be related to lower magnitude of financial crisis in some regions of the world and to specific regulations active in these countries. Keywords: nonparametric frontier models, conditional efficiency, bandwidth, external factors, listed commercial banks. JEL Classification: C14, C40, C60, G15, G21. REFERENCES: Bădin, L., Daraio, C. and L. Simar (2012), How to measure the impact of environmental factors in a nonparametric production model, European Journal of Operational Research, 223 (3), 818-833. Bădin, L., Daraio, C. and L. Simar (2014), Explaining inefficiency in nonparametric production models: the state of the art, Annals of Operations Research, 214 (1), 5-30.

ID: 200 Determinants of Bank Profitability: Role of asset quality in Asia Authors: Nimesh Salike Contact: [email protected] Category: Other Topics Abstract: This study dwells upon the profitability of banks using panel data from 12 Asian economies over the years 2001-
2011. In particular, special attention is paid to the role of asset quality. After conducting the Hausman test, the study adopted a fixed-effects regression. We found that asset quality, measured by impaired loans over gross loans, has a significant and negative impact on banks' profitability. Other bank-specific variables (capital sufficiency, liquidity, income diversification and operating inefficiency) are important factors in determining banks' returns. Further, macroeconomic factors (the growth rate of real GDP, inflation and the interest spread) also help determine banks' financial performance. We further classified the sample economies into two categories, advanced and non-advanced, and found that banks in the non-advanced economies are more likely to experience higher profit margins. This paper suggests that decision-makers design an effective incentive mechanism to align managers' interests with the banks' long-term objectives. The rationale is that better financial performance could be achieved by lowering lending requirements in order to supply more loans; as a result, banks might face higher default risk. With respect to other bank-specific factors, managers should attempt to reduce operating costs and focus on the markets where they have comparative advantages, in order to soften the shocks of intense competition and inherent imperfections. JEL codes: G20, G210, G280, M160
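As a minimal, purely illustrative sketch of a fixed-effects profitability regression of this kind (the file name, variable names and clustering choice are assumptions, not the authors' specification):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical panel: one row per bank-year.
    df = pd.read_csv("bank_panel.csv")  # columns assumed: bank, year, roa,
                                        # impaired_ratio, capital, liquidity,
                                        # diversification, cost_income,
                                        # gdp_growth, inflation, spread

    # Within (fixed-effects) estimation approximated with bank dummies; in the
    # paper a Hausman test guides the fixed- vs random-effects choice.
    fe = smf.ols("roa ~ impaired_ratio + capital + liquidity + diversification"
                 " + cost_income + gdp_growth + inflation + spread + C(bank)",
                 data=df).fit(cov_type="cluster", cov_kwds={"groups": df["bank"]})
    print(fe.params.filter(regex=r"^(?!C\()"))  # report the non-dummy coefficients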


ID: 203 Investigating the technical efficiency of SMEs in non-resource sectors in a resource dependent and
developing economy: application of DEA bootstrap Authors: Lesego Sekwati Contact: [email protected] Category: ECR Abstract: In this study we apply the DEA bootstrap to investigate the efficiency of SMEs in non-resource sectors in a
resource dependent and developing economy, Botswana. We follow the method of Simar and Wilson (2007) in the regression of possible sources of inefficiency. We focus on the marginal impact of financing constraints, export orientation, firm age, the entrepreneur’s education and experience. We find considerable inefficiency in production units in our sample, with an estimated mean inefficiency score of about 45 percent. Regression results suggest a positive relationship between inefficiency and financing constraints. In the literature on resource dependence it is frequently suggested that resource sectors tend to squeeze out non-resource sectors in terms of access to resources, with possible negative consequences on firms operating in non-resource sectors. While financing constraints faced by SMEs in non-resource sectors cannot solely be attributed to possible effects of resource dependence (financial sector development could be important as well) our results suggest that it could well be that efficiency of SMEs in non-resource sectors is partly constrained by these effects. Our results also suggest a positive relationship between exporting and inefficiency. Our rationalization of this odd result is that exporting firms may be using sub-optimal technology. We also find a positive relationship between firm age and inefficiency. We think that younger SMEs possibly adopt new advances in technology at the time of their conception which makes them more efficient. Results on the entrepreneur's education suggest a negative relationship. Surprisingly, results show a positive relationship between the entrepreneur’s years of experience in industry and inefficiency. We expected that less experience would contribute to the entrepreneur’s inability to perform their functions satisfactorily. Our results do not support this expectation. We think that more years of experience do not necessarily translate into requisite business skills. We also think that younger entrepreneurs perhaps have benefited from improvements in the education system, hence contribute more to their firms than their older counterparts. Furthermore, there may be a predominance of fear of failure among older entrepreneurs which holds them back. We also suspect a predominance of entrepreneurs driven by necessity (associated with low impact on performance) among older entrepreneurs.

ID: 204 Productivity and R&D in Australian Broadacre Agriculture: A Semi-parametric Smooth Coefficient Model Authors: Farid Khan Contact: [email protected] Category: Bootstrap for DEA or SFA Abstract: Introduction In recent years, a body of research has been carried out to examine the influence of research
and development (hereafter, R&D) on total factor productivity (hereafter, TFP) in the agricultural sector. A number of studies provide empirical evidence that R&D is one of the main sources of productivity growth in any economy (Hall and Scobie, 2006; Griliches, 1979, 1988; Coe and Helpman, 1995). Investments in agricultural R&D and policies that affect agricultural R&D are central to improvements in agricultural productivity growth (Mullen, 2010a). They bring about a more effective use of existing resources and thereby raise the productivity level. Productivity growth in agriculture has also been an essential source of economic prosperity in Australia. However, recent studies suggest that productivity growth in at least some sectors of Australian agriculture has been slowing (Mullen and Crean, 2007; Nossal and Sheng 2010). One possible cause identified by these studies is the fact that public investment in R&D has been falling over past decades (Mullen, 2010b; Alston and Pardey, 2001; Bervejillo et al., 2012; Warr, 2012). These recent phenomena have renewed interest in exploring the relationship between public funding of agricultural R&D and productivity growth (Pardey et al. 2006; Sheng et al. 2010; Mullen 2010). Piesse and Thirtle (2010) mention a slowdown and retargeting of public R&D as the first of four factors primarily causing a slowdown in TFP growth in the United Kingdom. A few previous studies estimate the rate of return to R&D expenditure in Australian broadacre agriculture and indicate that public investment in agricultural R&D contributes to TFP growth, but this has not been explored deeply and empirically.


ID: 205 Productivity and Efficiency of Leading European Electricity Companies in a time of Economic Downturn Authors: Osiris J. Parcero and Christopher O'Donnell Contact: [email protected] Category: DEA - Applications to Energy Abstract: This paper explores profitability differences among a sample of European electricity distributors. The paper
starts by decomposing profitability change into the product of a terms of trade (TT) index and a total factor productivity (TFP) index. The TFP index is a relatively new index that satisfies a suite of common-sense axioms from index number theory, including transitivity, circularity and identity. It also has the property that it can be decomposed into measures of technical change, technical efficiency change, and scale-mix efficiency change. This decomposition is exhaustive, and exact. This paper estimates the various components of TFP change using data envelopment analysis (DEA). The data are an unbalanced panel of observations drawn from the ORBIS database. The results suggest that, on average, TFP was down almost 25% due to an unproductive choice of input mix. Economic theory suggests that electricity distributors change input mixes in response to changes in (expected) input prices. The paper investigates the relationship between changes in input mix and changes in, for example, the price of oil.
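The profitability decomposition referred to can be sketched, in O'Donnell-style notation, as

    \frac{R_{it}/C_{it}}{R_{hs}/C_{hs}} = \underbrace{\frac{P_{hs,it}}{W_{hs,it}}}_{\text{terms-of-trade index}} \times \underbrace{\frac{Q_{hs,it}}{X_{hs,it}}}_{\text{TFP index}},

where R and C are revenue and cost, P and W are aggregate output- and input-price indexes, and Q and X are aggregate output- and input-quantity indexes; the TFP index is then further decomposed into technical change, technical efficiency change and scale-mix efficiency change components.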

ID: 211 Evaluating Changes in Total Factor Productivity in the Amendment 80 Catcher/Processor Fishery Authors: Ron Felthoven, Ben Fissel, Stephen Kasperski and Chris O'Donnell Contact: [email protected] Category: Index Numbers - Applications Abstract: Amendment 80 of the Bering Sea & Aleutian Island Groundfish Fisheries Management Plan instituted
groundfish retention standards (GRS) with the goal of reducing discards, which had been historically high for a particular fleet. The GRS continue to be a subject of regulatory attention as recent discard levels are markedly below the GRS targets. Coincident with the implementation of the GRS an economic cost-data collection program was established which will enable us to estimate the impact of these regulatory changes on the Amendment 80 fleet. Using the aggregate-quantity-price index framework of O’Donnell (2012a, 2012b) this paper explores the relationships among technologies, regulations and behavior as well as develops empirical estimates of the effects of groundfish retention standards on net revenues, productivity, and efficiency in the Amendment 80 fishery.

ID: 212 Evaluating the performance of Australian superannuation funds: A non-parametric approach

Authors: Mrs Yen H. Bui and Dr Sarath B. Delpachitra

Contact: [email protected]

Category: DEA - Other Applications

Abstract: This research estimates the relative efficiency of Australian superannuation funds using data envelopment analysis (DEA). Data for the research are sourced from the APRA database and financial reports of superannuation funds. The sample comprises 183 superannuation funds. This is approximately 79% of a population of 231 largest APRA-regulated funds as at 30 June 2012. The research covers a seven-year period, from 2005 to 2012. The findings indicate that most Australian superannuation funds are inefficient relative to the efficiency frontier, a benchmark established by efficient funds. Consequently, the efficiency targets (such as reduction in expenses and volatility of return) are very challenging for the majority of the funds. The findings emphasize the need for improving efficiency of Australian superannuation funds to narrow the gap in performance between efficient and inefficient funds. While volatility of return can be affected by external circumstances prevailing in the financial markets, expenses are in general more under the control of fund managers. The findings therefore provide a case for mandatory disclosure of fees and expenses in a comparable manner as a means toward accountability and justification of fund performance.


ID: 213 Benchmark Shadow Prices for an Extreme Point in Data Envelopment Analysis

Authors: Sungko Li and Long Zhao

Contact: [email protected]

Category: DEA - Theory

Abstract: In some DEA models, shadow prices play an important role in the decomposition of certain indices; see Li and Ng (1995), Cheng and Li (2007), Färe and Zelenyuk (2003), and Nesterenko and Zelenyuk (2007). However, the non-uniqueness of shadow prices always hampers the applicability of these models. A recent study (Fang and Li 2013) verifies that in some cases, due to this non-uniqueness, conclusions drawn from models based on shadow prices may conflict. Such non-uniqueness also limits the informativeness of shadow prices in other studies; for example, see Liang et al. (2008) for cross efficiency, Amin et al. (2011) for clustering, and Banker et al. (1984) for the assessment of returns to scale. We recognize that the non-uniqueness of shadow prices is an intrinsic property of the DEA methodology. Since an extreme point is the intersection of a finite number of hyperplanes, a complete characterization of all possible shadow prices would require the identification of all the frontier facets and their intersections; see Fukuda (1993) and Rosen et al. (1998). Such an approach encounters greater computational difficulty as the number of inputs and outputs grows. Instead of characterizing shadow prices using all adjacent facets, this paper introduces an easy way to compute a unique benchmark shadow price vector at an extreme point. A unique hyperplane close to the extreme point is constructed following the algorithm introduced in this paper. The normal of this hyperplane is then a benchmark shadow price at the extreme point. This benchmark shadow price vector is proved to possess a number of desirable properties. A numerical example is provided to illustrate its superiority.

ID: 215 Is science-based innovation more productive? A firm-level study

Authors: Alfons Palangkaraya, Thomas Spurling and Elizabeth Webster

Contact: [email protected]

Category: Other Topics

Abstract: We use both an SME and a large enterprise firm-level dataset to test for the effect of innovation on future productivity. Regression analysis from the SME dataset (7724 Australian SMEs over the period 2005-06 to 2011-12) reveals that innovation raises productivity, from one to four years later, by 21 per cent. The strongest effects are derived from organisational and managerial process innovations and innovations sourced from science. If the firm also collaborated in one aspect of their activity, productivity rose by an additional 6 per cent. Regression analysis from the large enterprise dataset (220 Australian large firms over the period 2000-01 to 2011-12) shows that innovation on its own does not raise productivity, but innovation that is accompanied by verified ‘good management’ practices does raise later productivity. In both models, the measure of productivity is estimated from firms’ financial records while the innovation data is drawn from survey material.


ID: 216 Do the use of Imported Inputs and Absorptive Capacity Matter for Productivity? Evidence from
manufacturing firms in Ghana. Authors: Luke Emeka Okafor Contact: [email protected] Category: Growth Accounting Abstract: This paper investigates whether the use of imported inputs and absorptive capacity matter for productivity
using panel data from Ghanaian manufacturing firms over the period 1991-2002. Recent theoretical and empirical evidence on firm heterogeneity and trade focuses mainly on exporting and foreign direct investment (FDI). Previous literature shows that internationally active firms tend to be more productive than domestically-oriented firms. However, most of the empirical studies that tested theoretical priors about trade and firm heterogeneity used firm-level data from developed or emerging economies. As a result, little is known about the effect of imported inputs on productivity for firms operating in sub-Saharan Africa. Most R&D activities are concentrated in developed economies. As a result, firms operating in developing countries may enjoy sizeable international knowledge spillovers. One of the channels for the international transmission of knowledge is imported technology. However, the ability of a firm to assimilate new external knowledge might depend on its absorptive capacity. The commonly used proxies for absorptive capacity (ABC) in the existing literature fail to capture relative improvements or deteriorations in the ability of each firm to assimilate new external knowledge over time. In addition, most of these proxies exhibit low variability over time or tend to be correlated with firm-specific time-invariant effects. A novel proxy of lagged relative initial productivity is proposed to capture the influence of ABC in the underlying link between imported inputs and productivity. ABC is measured in two ways. First, it is calculated as the natural logarithm of value-added per worker in the previous period minus the natural logarithm of value-added per worker for each firm when first observed in the sample. Second, it is measured as the natural logarithm of total factor productivity (TFP) in the previous period minus the natural logarithm of TFP for each firm when first observed in the sample. These two measures of ABC are positively and significantly correlated. Results show that importing firms with higher absorptive capacity improve productivity by absorbing technology spillovers embodied in imported inputs, both contemporaneously and from past import activity. There is no strong evidence suggesting that importing firms with low absorptive capacity gain from the use of imported inputs.
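In symbols, the two absorptive-capacity proxies described above are

    ABC^{VA}_{it} = \ln\left(\frac{VA_{i,t-1}}{L_{i,t-1}}\right) - \ln\left(\frac{VA_{i,t_0}}{L_{i,t_0}}\right), \qquad ABC^{TFP}_{it} = \ln TFP_{i,t-1} - \ln TFP_{i,t_0},

where t_0 is the first year in which firm i is observed in the sample (notation ours).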

ID: 217 Valuing water quality tradeoffs at the farm level: An integrated approach Authors: Moriah Bostian, Gerald Whittaker, Brad Barnhart, Rolf Fare and Shawna Grosskopf Contact: [email protected] Category: SFA – Applications to Agriculture and Fisheries Abstract: This study evaluates the tradeoff between agricultural production and water quality for individual
producers using an integrated economic-biophysical hybrid genetic algorithm. We apply a multi-input, multi-output profit maximization model to detailed farm-level production data from the Oregon Willamette Valley to predict each producer's response to a targeted fertilizer tax policy. Their resulting production decisions are included in a biophysical model of basin-level soil and water quality. We use a hybrid genetic algorithm to integrate the economic and biophysical models into one multiobjective optimization problem, the joint maximization of farm profits and minimization of nitrate runoff resulting from fertilizer usage. We then measure the tradeoffs between maximum profit and nitrogen loading for individual farms, subject to the fertilizer tax policy. We find considerable variation in tradeoff values across the basin, which could be used to better target incentives for reducing nitrogen loading to agricultural producers.


ID: 218 Farming efficiency in mixed smallholder crop-livestock agricultural systems in Uganda Authors: Aziz A. Karimov Contact: [email protected] Category: Bootstrap for DEA or SFA Abstract: Agriculture continues playing an important role in Uganda and remains the major source of rural
employment. Some 66 percent of the country's working population (UNHS 2009/10) is engaged in this sector and depends heavily on income from farming. The sector also provides raw materials to the industrial sector, and 40 percent of export earnings come from agriculture. The priority agenda for the government is the development of sustainable production systems. This requires agricultural intensification, which is key to achieving food security and reducing poverty, especially in rural areas. In this regard, mixed farming is considered a favoured option for intensified land use. In such systems, efficient use of resources becomes even more important and raises issues that require thorough consideration in public discussion and economic analysis. This study aims to obtain a better understanding of mixed crop-livestock systems and investigates the issue by analyzing technical efficiency and its determinants using a two-wave household panel survey dataset for the years 2005/2006 and 2009/2010. Both parametric and non-parametric methods are applied in the analysis to compare results. The stochastic frontier model imposes monotonicity restrictions using the three-stage procedure recently developed by Henningsen and Henning (2009). Research findings indicate efficiency improvements in both crop and livestock production. Furthermore, socio-economic and demographic characteristics of households, access to markets, the land tenure system, and locational and agro-ecological variables affect technical efficiency. Based on the preliminary results, the study recommends developing extension services and enhancing their role in technology introduction and knowledge sharing. Government attention should be directed towards investing in irrigation schemes, enforcing the National Land Policy and providing incentives for households to use input resources sustainably and in the most efficient way.

ID: 219 A Semiparametric Stochastic Frontier Model with Correlated Effects Authors: William E. Griffiths and Gholamreza Hajargasht Contact: [email protected] Category: SFA - Theory Abstract: We consider a semiparametric panel stochastic frontier model where one-sided firm effects representing
inefficiencies are correlated with the regressors. A form of the Chamberlain-Mundlak device is used to relate the logarithm of the effects to the regressors resulting in a lognormal distribution for the effects. The function describing the technology is modelled nonparametrically using penalised splines. The function in the lognormal distribution linking the effects and the regressors can be specified in a similar way, but given that estimating a model with nonparametric functions for both the technology and the effects is relatively ambitious, we focus on a linear parametric function for the effects. Both Bayesian and non-Bayesian approaches to estimation are considered, with an emphasis on Bayesian estimation. A Monte Carlo experiment is used to investigate the consequences of ignoring correlation between the effects and the regressors, and choosing the wrong functional form for the technology.
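One way of writing the structure described (our notation; the paper's exact specification may differ) is

    y_{it} = m(x_{it}) + v_{it} - u_i, \qquad \ln u_i = \alpha_0 + \bar{x}_i'\gamma + \varepsilon_i, \quad \varepsilon_i \sim N(0, \sigma_\varepsilon^2),

so that the one-sided effect u_i is lognormal conditional on the firm-level regressor means \bar{x}_i (the Chamberlain-Mundlak device), m(\cdot) is the technology modelled with penalised splines, and v_{it} is symmetric noise.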

ID: 220 Unobserved heterogeneity and endogeneity in nonparametric frontier estimation Authors: Léopold Simar, Anne Vanhems and Ingrid Van Keilegom Contact: [email protected] Category: SFA - Theory Abstract: In production theory and efficiency analysis, firm efficiencies are measured by their distances to a
production frontier, which is the geometrical locus of optimal combinations of inputs and outputs. It is today recognized that in the presence of heterogenous conditions (like environmental factors) that influence the shape and the position of the frontier, traditional measures of efficiency obtained in the space of inputs/outputs have no sensible economic meaning. This is because the benchmark frontier may not be attainable for a firm facing different heterogenous conditions. Using a nonparametric approach, this can be corrected by using conditional frontiers and conditional efficiency scores developed in the literature. In this paper we extend these concepts in the case where the heterogeneity is not observed. We propose and analyze a model where the heterogeneity variable is linked to a particular input (or output). It is defined as the part of the input (or the output), independent from some instrumental variable through a nonseparable nonparametric model. We discuss endogeneity issues involved in this model. Under certain regularity assumptions, we show that the model is identified, we propose nonparametric estimators of the conditional frontier and the conditional efficiency score, and analyze their asymptotic properties. When using FDH estimators we prove the asymptotic convergence to a Weibull distribution, whereas when using the robust order-m estimators we obtain the asymptotic normality of the estimators. The method is illustrated with some simulated and real data examples. A Monte-Carlo experiment shows how the procedure works for finite samples.


ID: 221 A SBM measurement of Chinese bank efficiency with considering undesirable output Authors: Tsu-Tan Fu and Rui-zhi Pang Contact: [email protected] Category: SFA - Applications to Banking Abstract: This paper reports the slack-based measure (SBM) efficiency of banks in China during 2009-2011. To
investigate the impact of the quality factor on bank efficiency, in addition to the conventional outputs (namely loans, investments and off-balance-sheet items), one undesirable output (non-performing loans) has been considered. We also consider a risk factor in the model by treating financial capital as a control input. A total of 61 commercial banks of 5 different types (state-owned, national stock-shared, city, rural, and foreign banks) are measured and cross-compared. Empirical results show that, on average, the state-owned banks have the highest technical efficiency among all types of banks, followed by the national stock-shared banks. The average efficiency of the foreign banks is found to be higher than that of city banks and rural banks. We further analyze the determinants that explain the efficiency differences between banks using a regression model. Factors such as innovation, profitability, cost control, cross-regional operation, and risk management capability are identified as having significant impacts on the efficiency of banks in China.
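A generic slacks-based measure with an undesirable output, in the spirit of Tone's SBM (sketched in our notation; the paper's handling of the capital control input may differ), is

    \rho^{*} = \min \frac{1 - \frac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{io}}{1 + \frac{1}{s_1 + s_2}\left( \sum_{r=1}^{s_1} s_r^{g}/y_{ro}^{g} + \sum_{r=1}^{s_2} s_r^{b}/y_{ro}^{b} \right)}
    \quad \text{s.t.} \quad x_o = X\lambda + s^{-}, \; y_o^{g} = Y^{g}\lambda - s^{g}, \; y_o^{b} = Y^{b}\lambda + s^{b}, \; \lambda, s^{-}, s^{g}, s^{b} \ge 0,

where y^{g} are desirable outputs (loans, investments, off-balance-sheet items), y^{b} is the undesirable output (non-performing loans), and the s terms are slacks.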

ID: 222 Endogeneity in Stochastic Frontier Models Authors: Subal C Kumbhakar
Contact: [email protected] Category: SFA - Theory Abstract: Endogeneity in stochastic frontier (SF) models can arise from a variety of reasons. Mundlak (1961)
associated inefficiency with time-invariant management and suggested estimation of production function parameters free from management bias. In efficiency we are interested in estimating the production technology and observation-specific inefficiency in both cross-sectional and panel models. Consistent estimation of technology parameters using a production function requires inputs to be uncorrelated with both the noise and inefficiency components. Similarly, in output distance function (with multiple outputs), output ratios and inputs are to be orthogonal to inefficiency and noise components. The opposite is the case with input distance function. If these orthogonality conditions do not hold, endogeneity problem arises that causes inconsistent estimates and invalid estimates of inefficiency. Economically endogeneity arises because inputs and outputs might be decision variables (i.e., a producer decides how much output to produce and how much inputs to use). Optimal solutions of these decision variables are likely to be affected by inefficiency (if not noise). That is, since all these variables are simultaneously determined, anything that affects one is likely to affect others. This is labeled as the simultaneity problem. Correlations can also arise due to omitted variables (inputs and outputs), measurement errors in inputs and outputs, functional form misspecification, etc. In this paper we address the endogeneity problem associated with simultaneity. The standard textbook approach to deal with endogeneity problem is to use instrumental variables and apply GMM. GMM does require instruments for the endogenous variables. However, it does not require any economic behavioral assumption. GMM gives consistent estimates of the production technology parameters but not inefficiency which can be obtained in the second stage from the residuals using some assumptions on the distributions of inefficiency and noise. One can make GMM richer by making economic behavioral assumption (such as cost minimization, profit maximization, revenue maximization, maximization of return to the outlay, etc.). In using economic behavioral assumption, one assumes availability of prices which are nothing but instruments for the endogenous input and output variables. But these instruments are used in a way that make use of the first-order conditions (FOCs) of cost minimization, profit maximization, revenue maximization, maximization of return to the outlay, etc. The use of FOCs gives us a system on which GMM is used. The alternative to GMM is to use the ML method on the system defined by the technology and the associated FOCs. The use of ML method requires distributional assumptions on the noise and inefficiency components. The benefit of this is that we not only get consistent estimates of the technology parameters but also prediction of inefficiency as a by-product. The ML method can also be used to the dual (cost, profit, revenue) functions which are obtained by plugging the optimal values of the variables from the FOCs into the objective function (behavioral equation). By construction, the regressors in the dual functions (assuming that the economic behavior is appropriate) are exogenous. Thus, endogeneity problem can be avoided, in principle, by estimating the technology from the dual functions. One can use either a single equation or a system for this. Again the ML method gives consistent estimates of the technology and observation specific estimates of inefficiency. 
We discuss these issues for multiple-output, multiple-input technologies in both primal and dual settings. In the primal setting we use the transformation function and the associated FOCs. In the dual setting we use the dual cost/profit/revenue functions and estimate them using either a single-equation or a system framework. Both the technology and inefficiency are estimated jointly in a single step.


ID: 223 Ownership Structure and Efficiency in Banking: The Case of Ukraine Authors: Natalya Zelenyuk and Valentin Zelenyuk Contact: [email protected] Category: SFA - Applications to Banking Abstract: The banking sector plays a major role in any economy. It also played a critical role in the transition of Ukraine to a market economy. Before the crisis of 2008, it was one of the most dynamic and leading industries in the country, growing about 50% in assets annually and stimulating all other industries to grow. The banking sector was also among the first sectors in Ukraine to signal the coming crisis of 2008-2010; it was the first to enter the recession and served as a catalyst for it to spread over the entire economy. Much of the hope for the recovery of the Ukrainian economy from the crisis was also tied to the resurrection of the banking sector. In this work, we investigate the hypothesis that, despite the tremendous growth and apparent prosperity before the crisis, the Ukrainian banking industry suffered from substantial inefficiency, which, when the crisis erupted, jeopardized its stability and, as a result, deepened the crisis.