School of Business and Economics Faculty Research

Recent Submissions

Now showing 1 - 7 of 7
  • Item
    Factors influencing swift and effective resolution of supplier problems
    (Emerald Publishing Limited, 2020) Clemons, Rebecca; Baddam, Swathi Reddy; Henry, Raymond M.
    Purpose – How might an organization swiftly resolve supplier problems so that the issue does not recur? This study seeks to understand the impact of different knowledge-sharing routines on measures of effective problem resolution.
    Design/methodology/approach – Data are collected from an automotive manufacturer’s (buyer) database. A hierarchical linear model analyzes dyadic data collected from 155 problems across 24 suppliers.
    Findings – This study reveals that different ways of communicating have differing impacts on measures of effective problem-solving. Communication involving face-to-face interaction slows the process, whereas frequent communication can lead to swift resolution. Furthermore, management teams are more likely to lead to a “better” fix in that these teams are more likely to implement changes in the process or product.
    Research limitations/implications – The data are for a tier-one automotive supplier. Hence, the findings are limited by the extent to which other organizations may differ.
    Practical implications – The results provide insights for managers experiencing supply issues. Some forms of communication should be encouraged as they enhance the process. Moreover, the findings suggest there are consequences to pressuring a supplier to resolve a complaint quickly.
    Originality/value – Very few researchers can claim to have investigated observed collaborative mechanisms that occur between a buyer and its suppliers when resolving a problem. This research adds to the literature on the relational view theory as it applies to supply chain management and problem resolution.
  • Item
    Will Robots Agree to Pay Taxes? Further Tax Implications of Advanced AI
    (Indiana University East, 2020-05-18) Bogenschneider, Bret N.
    Will Robots Agree to Pay Taxes? Tax Ideology meets Advanced AI
    Bret N. Bogenschneider, PhD, JD, LLM, Assistant Prof., Accounting & Taxation*
    Introduction
    a. Should robots pay taxes? Or, will robots agree to pay taxes?
    b. Positing that advanced AIs will engage in tax structuring (manipulation of the tax base)
       *Explanation of the preference for income tax systems due to the automatic efficiency gains therefrom
    c. Positing that “tax actualing” by advanced AIs will supplant current methods of economic modeling
  • Item
    Should Robots Pay Taxes? Tax Policy in the Age of Automation
    (Harvard University; text held by University of Surrey, SRI (Surrey Research Insight), in an institutional repository; republished in Economic Law Review (Chinese), 2017) Bogenschneider, Bret N.; Abbott, Ryan
    Abstract: Existing technologies can already automate most work functions, and the cost of these technologies is decreasing at a time when human labor costs are increasing. This, combined with ongoing advances in computing, artificial intelligence, and robotics, has led experts to predict that automation will lead to significant job losses and worsening income inequality. Policy makers are actively debating how to deal with these problems, with most proposals focusing on investing in education to train workers in new job types, or investing in social benefits to distribute the gains of automation. The importance of tax policy has been neglected in this debate, which is unfortunate because such policies are critically important. The tax system incentivizes automation even in cases where it is not otherwise efficient. That is because the vast majority of tax revenue is now derived from labor income, so firms avoid taxes by eliminating employees. More importantly, when a machine replaces a person, the government loses a substantial amount of tax revenue, potentially trillions of dollars a year in the aggregate. All of this is the unintended result of a system designed to tax labor rather than capital. Such a system no longer works once the labor is capital. Robots are not good taxpayers. We argue that existing tax policies must be changed. The system should be at least “neutral” as between robot and human workers, and automation should not be allowed to reduce tax revenue. This could be achieved by disallowing corporate tax deductions for automated workers, creating an “automation tax” which mirrors existing unemployment schemes, granting offsetting tax preferences for human workers, levying a corporate self-employment tax, or increasing the corporate tax rate. We argue the ideal solution may be a combination of these proposals.
  • Item
    Wage Taxation and Public Health
    (Rutgers University, Camden, NJ, 2016) Bogenschneider, Bret N.
    The structure of a tax system is relevant to public health. Wage taxes are the predominant form of taxation in both Europe and the United States. Yet, high rates of wage taxation harm worker health, particularly when wage taxes are part of an overall regressive tax system. The negative effects of wage taxes on public health operate by (1) pushing marginal workers into absolute poverty through the payment of wage taxes; (2) increasing working hours for low-wage workers; (3) increasing levels of economic inequality through relatively higher tax and audit rates on persons with labor income; and (4) reducing financial (and time) investment in children by overburdened workers. Optimal tax policy accordingly requires an evaluation of the cost of wage taxes levied on the “health capital” of workers as well as their financial capital.
  • Item
    How do the Professional Ethics of Taxation account for Legal Indeterminacy?
    (University of Missouri, 2020) Bogenschneider, Bret N.
    The modern framework of professional tax ethics is often given in reference to famous quotations of Justice Oliver Wendell Holmes or Judge Learned Hand. The common quote from Holmes is that “the very meaning of a line in the law is that you may intentionally go as close to it as you can if you do not pass it”; Hand’s quote is that “there is nothing sinister in so arranging affairs as to keep taxes as low as possible... [a taxpayer] is not bound to choose that pattern which will best pay the Treasury; there is not even a patriotic duty to increase one’s taxes.” However, there are two significant problems when these are applied to form the basis of tax ethics: First, Holmes’ idea of “crossing the line” is taken as a presumption that tax laws are legally determinate. They are not. Every tax practitioner ought to be aware that tax laws are not legally determinate. Accordingly, the limits of tax planning should not be expected to be clearly marked. Second, Hand’s premise of the legitimacy of “arranging affairs” raises the problem of structuring. By structuring, the tax practitioner creates a convoluted and indeterminate transaction out of a previously known set of facts. The respective “facts” then become slippery, just as Karl Llewellyn said, so the dream of tax law as a complete and fully valid set of intersecting code provisions dramatically falls apart. The Internal Revenue Service has struggled to respond to this challenge with new penalties and ever-changing tests. However, tax structuring represents a new animal in terms of legal philosophy comprising “Factual Indeterminacy”, where the underlying “facts” become indeterminate in various ways. This changes things for tax ethics because the standard line—“the lawyer applies the law to the facts”—is not an exclusive description of tax lawyering. By structuring, the tax lawyer is sometimes pushing toward indeterminacy. 
In nearly all other legal contexts, lawyers push in the opposite direction, away from indeterminacy. Various ethics scholars have proposed that the tax lawyer merely acts in different roles in different contexts, and that personal standards of ethics (or morals) could serve as a guide to ethical lawyering. But this approach appears to be merely a description of tax lawyering in various situations and not an ethical standard; any standard which merely refers to the idiosyncratic personal ethics or morality of the tax lawyer is tantamount to not having any ethical standard at all. The lack of professional standards should be expected to have catastrophic consequences for the tax profession, especially for younger tax practitioners looking for ethical guidance. An illustration of an ethical dilemma is provided here using the actual terminology for responsibility for tax fraud within large corporations: “passing the monkey”. An alternative framework of professional tax ethics, based on the direction of tax planning toward or away from indeterminacy, is also proposed here.
  • Item
    How Accurate are Probabilistic Odds Claims in Criminal Trials? A “Warranted Skepticism” Approach
    (Mississippi Law Journal, 2019) Bogenschneider, Bret N.
    Probabilistic odds claims used in criminal trials are often inaccurate due to subjectivity within the methods of forensic science. The potential sources of subjectivity are wide, since forensic science is an adversarial process and a full disclosure of assumptions is not required under Brady. The prior literature has focused on the limits of Bayesian methods and the potential uniqueness of DNA fingerprints to each person. This paper is unique because it focuses on other sources of subjectivity, such as a lack of disclosure of test results, repeated trials, a presumption of laboratory accuracy, the absence of tests, and so on. The infamous Gilyard case is helpful as an illustration of one common source of subjectivity. The standard approach for DNA analysis was applied first to determine a “match” between Gilyard’s DNA and a test sample, and then to estimate the relative frequency of Gilyard’s DNA in a reference population at odds of 1-in-18-quadrillion. The odds figure is roughly a million times the number of persons now living or that will ever live and amounts to a strong claim that Gilyard’s DNA profile is unique. The “reference population” used to derive the odds claim is hypothetical (or, largely non-existential), just as Karl Popper warned. However, in every criminal case with DNA evidence, there are at least two DNA samples that appear to match (in Gilyard, there were seven) which are not included in the reference population. A significant question is whether the reference population should include the samples at issue in the case. Where the composition of the reference population is merely hypothetical, it presumably should be updated to reflect any new evidence as a matter of Bayesian science. Of course, the population is extrapolated from a small dataset, so the existence of two or more matching DNA profiles might reduce the odds to a figure perhaps in the thousands, rather than quadrillions.
This raises the severe problem that the probabilistic odds calculation depends on the subjective determination of a “match” in the first step. One scholar has suggested that “match” claims are objective because the process has been partly mechanized, even though the interpretation of results has not been standardized. In modern science, however, objectivity refers to replicability by experiment, including the interpretation of results. This paper develops many additional sources of subjectivity in forensic science and suggests: if subjectivity exists to degree x, then any related or resulting probabilistic odds claim may not exceed x. As an example, if laboratory error is possible at a rate of 1 in 10,000, then any reported probabilistic odds should not exceed 1 in 10,000. The inherent subjectivity of all probabilistic odds claims, as identified by Frank Ramsey, is also explored. The conclusion is that a “warranted skepticism” approach to the use of remote odds claims in criminal cases, such as under Daubert, remains appropriate.
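The capping principle in this last abstract can be illustrated with simple arithmetic. The sketch below is not from the paper itself; it assumes a hypothetical laboratory error rate of 1 in 10,000 (the abstract's example figure) and the 1-in-18-quadrillion random-match figure, and shows why the larger error probability dominates the combined false-positive rate:

```python
# Hedged sketch (not the paper's own calculation): a reported "match" for a
# non-source suspect can arise from EITHER a coincidental genetic match OR a
# laboratory/handling error, so the rarer event cannot carry the evidential
# weight on its own.
def effective_false_positive_rate(random_match_prob, lab_error_rate):
    """Probability that at least one of the two error sources produces a
    reported match, assuming the two sources are independent."""
    return random_match_prob + lab_error_rate - random_match_prob * lab_error_rate

random_match = 1 / 1.8e16   # the 1-in-18-quadrillion figure from Gilyard
lab_error = 1 / 10_000      # hypothetical error rate, as in the abstract

effective = effective_false_positive_rate(random_match, lab_error)
print(f"effective false-positive rate: about 1 in {round(1 / effective):,}")
# The lab error rate dominates: the combined rate is about 1 in 10,000,
# not 1 in 18 quadrillion, consistent with the abstract's capping rule.
```

Because the two probabilities differ by twelve orders of magnitude, the sum is effectively equal to the laboratory error rate alone, which is the arithmetic behind the claim that reported odds should not exceed the error rate.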