39th Annual ORSNZ Conference 

Sunday 28th and Monday 29th November 2004

University of Auckland, Auckland, NZ

 

The following papers were presented at the 39th Annual Conference of the ORSNZ, held jointly with the 9th Annual International Conference on Industrial Engineering Theory, Applications and Practice (IJIE'04).

 

You may wish to view the final IJIE/ORSNZ program or the ORSNZ attendee list.

 

PaperID

AhmadI

Title

Paradigm Shift in Manufacturing Operation Through Implementation of Intelligent Dispatching

Abstract

Typically, wafer fabrication takes 40-60 days of cycle time and more than 500 processing steps, many of which revisit the same tool, to complete the overall process. The unique features of a wafer fabrication plant thus pose very challenging production planning and scheduling problems in controlling cycle time variation from product to product. One contributing breakdown occurs in the queue time taken by the operator to analyze and decide which lot should be dispatched next. This paper focuses on the implementation of intelligent dispatching, using commercial software to guide the operator to select the right lot at the right time through real-time lot sequencing at the respective tool on the FAB production floor. The implementation improved overall cycle time variation and bottleneck tool performance by 10% and 30% respectively, and has been successfully deployed in a real 200mm wafer foundry.

Authors

MA Chik, Ibrahim Ahmad, Mohd Yusoff Jamaludin

Speaker

Ibrahim Ahmad

Universiti Kebangsaan Malaysia

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

BeltranF

Title

A measure of the variability of revenue to an auctioneer: a look at the Revenue Equivalence Theorem.

Abstract

One not-so-intuitive result in auction theory is the Revenue Equivalence Theorem, which states that, as long as an auction complies with certain conditions, it will on average generate the same revenue for the auctioneer as any other auction satisfying the same conditions. Surprisingly, the conditions are not defined on the payment rules for the bidders, but on the facts that the bidders do not bid below a reserve value defined by the auctioneer, that the winner is the bidder with the highest bid, and that there is a common equilibrium bid function used by all bidders. In this paper we verify this result using extensive simulation of a broad range of auctions and focus on the variability, or fluctuation about the average, that the results show. These fluctuations are observed and measured in two dimensions for each type of auction: as the number of auctions grows and as the number of bidders increases.
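The revenue comparison described above can be illustrated with a small Monte Carlo sketch (an editor's illustration, not the authors' code; it assumes independent Uniform(0,1) valuations and the standard symmetric equilibrium bid functions for first- and second-price sealed-bid auctions):

```python
import random
import statistics

def simulate(n_bidders, n_auctions, seed=0):
    """Simulate sealed-bid auctions with i.i.d. Uniform(0,1) valuations and
    report mean revenue and its spread for two auction formats."""
    rng = random.Random(seed)
    first_price, second_price = [], []
    for _ in range(n_auctions):
        vals = sorted(rng.random() for _ in range(n_bidders))
        # First-price: symmetric equilibrium bid is (n-1)/n * valuation,
        # so the winner pays (n-1)/n times the highest valuation.
        first_price.append((n_bidders - 1) / n_bidders * vals[-1])
        # Second-price: truthful bidding; winner pays 2nd-highest valuation.
        second_price.append(vals[-2])
    return (statistics.mean(first_price), statistics.stdev(first_price),
            statistics.mean(second_price), statistics.stdev(second_price))

m1, s1, m2, s2 = simulate(n_bidders=5, n_auctions=20000)
# Theory: expected revenue is (n-1)/(n+1) = 2/3 for both formats with n = 5.
```

Both formats average near the theoretical (n-1)/(n+1) revenue, while the per-auction spread differs between formats, which is the kind of fluctuation about the average the paper measures.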

Authors

Fernando Beltrán, Natalia Santamaría

Speaker

Fernando Beltran

University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

ChakrabartiB

Title

Pricing Implications of Security Constrained Dispatch

Abstract

In an electric power system, operating reserves of "active power" are required to counteract frequency decay due to different contingencies, and regulation reserves to cover gradual frequency variation due to load fluctuation.  At present both categories are co-optimised, and traded, in the New Zealand electricity market.  But active and reactive power reserves can also be required to maintain a "security margin" to guard against load variations which could create "voltage collapse" situations.  This requirement can be modelled by imposing a constraint which maintains a load margin from the current operating point.  Here we recast the problem as a contingency constrained dispatch in which the critical issue is to be able to meet the load pattern which could emerge as a result of each contingency, given the resources and network available under that contingency condition.  A single security margin contingency can involve a load margin of different magnitude at different buses, and several constraints can be included in the model formulation.  The dual of this market-clearing optimisation model is analysed to determine the logical implications of such constraints on the prices for generation, reserve, and load.

Authors

Bhujanga B Chakrabarti and E Grant Read

Speaker

Bhujanga Chakrabarti

Transpower NZ Ltd

Full Paper

Abstract-only submission. You can view abstract [accepted]

YPP Entry?

No

 

PaperID

EhrgottM

Title

A Heuristic for the Discrete Biobjective p-Median Location Problem

Abstract

In this paper we develop a heuristic to find efficient solutions for the discrete p-median location problem with two objectives. The heuristic combines the vertex partitioning and exchange heuristics for the single-objective version of the problem. We show the results for an example problem, evaluate some measures of the quality of the heuristic solution, and give some indications of possible improvement.

Authors

Matthias Ehrgott, Bassy Tam

Speaker

Matthias Ehrgott

The University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

FouldsL

Title

Hay Harvesting Operations Scheduling Subject to Probabilistic Activity Duration and Machine Failure

Abstract

We report on a practical harvesting scenario arising in an agricultural context, based on data from commercial contracting enterprises that travel from farm to farm harvesting crops. The paper extends previous work from the deterministic case to allow for the possibility of harvesting machine failure and for the fact that activity duration times are uncertain. The duration of each activity depends on the combination of workers and machines allocated to it. Minimum time lags on the start of activities may be imposed, and a given sequence in which the farms are to be serviced, together with inter-farm travel times, must be taken into account. We report on a probabilistic dynamic programming formulation designed specifically for scenarios of this type. The computational times experienced in solving practical instances of the formulation are encouraging. The results represent significant improvements over the schedules traditionally used in practice, and it is believed that the formulation is a useful addition to the farm crop harvesting contractor’s tool kit.

Authors

L R Foulds

Speaker

Les Foulds

University of Waikato

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

GalliganD

Title

Maritime Patrol Modelling and Planning

Abstract

This paper reports on recent modelling carried out to address some defence questions on how to meet the need for maritime patrol around New Zealand. A variety of techniques have been applied in this ongoing study, including vehicle routing problems and agent-based modelling. Examples of the types of questions that need to be answered and the methods applied will be presented.

Authors

David Galligan

Speaker

David Galligan

Defence Technology Agency

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

GordonM

Title

Determining Knot Points For Spline Regression Models

Abstract

Spline regression analysis is a method for evaluating changes in data sets: a data set is split into a number of segments and an approximation to each segment is found. The boundary between segments is referred to as a "knot". As the number of knots in a data set increases there is a loss in data resolution. In some cases the locations of the knot points are known in advance and the only problem is modeling the different segments. However, when the locations are not known there are three problems: determining the number of knot points, the locations of these knots, and the modeling of each segment. I wish to develop a method for identifying the number of knot points and their locations. I will model the data set as a piecewise function and minimize the variance within each segment of the data. As the number of segments in the data set increases, the quality of the data fit will improve. I will attempt to identify the optimal number of segments that minimizes variance without causing excess loss of data resolution.
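The knot-placement idea sketched in the abstract, splitting the data into segments and minimising within-segment variance, can be illustrated for the piecewise-constant case (a minimal sketch, not the author's method; the segment count is assumed given here, whereas the paper seeks to determine it by trading fit against resolution):

```python
import math

def segment_costs(y):
    """cost[i][j]: sum of squared deviations from the mean of y[i..j]."""
    n = len(y)
    cost = [[0.0] * n for _ in range(n)]
    for i in range(n):
        s = ss = 0.0
        for j in range(i, n):
            s += y[j]
            ss += y[j] ** 2
            cost[i][j] = ss - s * s / (j - i + 1)
    return cost

def best_knots(y, k):
    """Place k interior knots (k + 1 segments) by dynamic programming,
    minimising total within-segment squared deviation for a
    piecewise-constant fit."""
    n = len(y)
    cost = segment_costs(y)
    dp = [[math.inf] * n for _ in range(k + 1)]
    back = [[-1] * n for _ in range(k + 1)]
    dp[0] = cost[0][:]                      # one segment covering y[0..j]
    for s in range(1, k + 1):
        for j in range(s, n):
            for t in range(s - 1, j):       # t: last index of previous part
                c = dp[s - 1][t] + cost[t + 1][j]
                if c < dp[s][j]:
                    dp[s][j], back[s][j] = c, t
    knots, j = [], n - 1                    # walk back to recover the knots
    for s in range(k, 0, -1):
        j = back[s][j]
        knots.append(j)
    return dp[k][n - 1], sorted(knots)

# A step change at index 9 is recovered as the single best knot.
y = [0.0] * 10 + [5.0] * 10
sse, knots = best_knots(y, 1)
```

In practice one would run this for increasing k and stop when the reduction in total variance no longer justifies the loss of resolution, e.g. by adding a penalty per knot.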

Authors

Matthew Gordon

Speaker

Matthew Gordon

Engineering Science, University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

Yes

 

PaperID

HasanMB

Title

Fishing Trawler Scheduling for Integrated Fisheries

Abstract

This paper analyzes fishing trawler scheduling for integrated fisheries. Fishing trawler scheduling is a well-known problem in the fishery industry, and the scheduling problems faced there are more complicated than traditional merchant vessel scheduling problems. The factors that influence the scheduling of fishing vessels include catching capacity, processing capacity, optimal cost, etc. Most importantly, a fishing trawler is a combination of a catching unit, a transport unit and a processing unit. We present a linear programming (LP) problem to schedule the fishing trawlers of an integrated commercial fishery, and demonstrate our method using a numerical example.

Authors

Mohammad Babul Hasan

Speaker

Mohammad Babul Hasan

Dept. of Management, University of Canterbury

Full Paper

You can view paper [accepted]

YPP Entry?

Yes

 

PaperID

HuangC

Title

Influence Diagrams: A Rough Set Approach

Abstract

Influence diagrams are a graphical technique for representing a decision problem under uncertainty. However, the numerical models underlying influence diagrams have been limited to probability distributions. When the dependencies or parameters are more imprecise, how to reason from the imprecise information becomes a core issue in evaluating influence diagrams effectively.

Rough set theory was first introduced by Pawlak as a tool for dealing with uncertainty and vagueness in decision-making, and probabilistic approaches have since been applied to it. The main objectives of this paper are (1) to describe how rough set theory can be applied to express the dependencies in influence diagrams, and (2) to develop rough set-based influence diagrams which combine rough set decision rules with the graphical structure of influence diagrams.

Authors

Chia-Hui Huang, Han-Lin Li

Speaker

Chia-Hui Huang

Institute of Information Management, NCTU

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

KaoHY

Title

Fuzzy Reasoning and Optimization Based on a Generalized Bayesian Network

Abstract

Bayesian networks have been widely used as knowledge bases under uncertainty. However, in most of the literature, the uncertainty measures in Bayesian networks are limited to crisp probability distributions and crisp variables, which restricts the practical usefulness of Bayesian networks when incomplete knowledge or linguistic vagueness is involved in the reasoning system. This study develops a generalized Bayesian network in which fuzzy variables, crisp variables, fuzzy parameters and crisp parameters can all be considered. Based on the generalized Bayesian network, fuzzy reasoning models for prediction, diagnosis and optimization can be designed. This study also develops the algorithms for fuzzy reasoning and optimization. The fuzzy reasoning and optimization models based on the generalized Bayesian network will be applied to supply chain management.

Authors

Han-Ying Kao

Speaker

Han-Ying Kao

Department of Marketing and Distribution, Hsuan Chuang University

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

KirkpatrickS

Title

Better Base Locations for the Melbourne Ambulance Service

Abstract

The Melbourne Ambulance Service uses a simulation system, called Siren, to aid their system planning. One of their questions is where to locate their ambulance bases to improve response times to calls. The aim of this project is to develop a new search algorithm that uses Siren to determine better base locations.  Initially several integer programming models were solved using AMPL and the resulting scenarios were simulated in Siren.  The many assumptions involved in these models limited their practical application and thus most attention was given to the development of a Local Search method.  This procedure works with an existing set of base locations and exploits the realism of Siren's simulations to improve solution quality.  The Local Search examines the effect of moving bases to each node in the transport network within a specified radius from their original location.  To determine the effect of these movements a Score Function was developed to provide a score for each test location.  The method works by considering each base in turn and moving it to the valid test node that provides the highest score.  The Local Search produced significant improvements in system performance in a single iteration.
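The Local Search step described above can be sketched in outline (an editor's illustration only: the real procedure scores candidate locations by running Siren simulations, which is replaced here by an arbitrary score callable; the node set, radius and score function are invented):

```python
def local_search(bases, neighbours, score):
    """One pass of a base-relocation local search: consider each base in
    turn and move it to the candidate node within its search radius that
    gives the highest score.  `score(bases)` stands in for a simulation
    run; here it is any callable returning a number to be maximised."""
    bases = list(bases)
    for i, b in enumerate(bases):
        best_node, best_score = b, score(bases)
        for node in neighbours(b):          # nodes within the search radius
            trial = bases[:i] + [node] + bases[i + 1:]
            s = score(trial)
            if s > best_score:
                best_node, best_score = node, s
        bases[i] = best_node                # keep the highest-scoring move
    return bases

# Toy example: nodes 0..10 on a line, demand at 2 and 8; the score is the
# negative total distance from each demand point to its nearest base.
demand = [2, 8]
score = lambda bs: -sum(min(abs(d - b) for b in bs) for d in demand)
neighbours = lambda b: [n for n in (b - 2, b - 1, b + 1, b + 2) if 0 <= n <= 10]
result = local_search([0, 10], neighbours, score)
```

A single pass like this mirrors the report's finding that one iteration can already yield a significant improvement, since each base is greedily moved to its best nearby node.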

Authors

Sarah Kirkpatrick

Speaker

Sarah Kirkpatrick

University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

Yes

 

PaperID

LaurenM

Title

Applications of fractals to weather and electricity hedging decisions

Abstract

In recent years many businesses facing monetary risk due to the fickle nature of weather have attempted to mitigate this risk by using financial hedging instruments.  Similar instruments have also been applied (more so in Australia than New Zealand) to commodities whose price may be easily dependent on weather phenomena, such as electricity. In this case, hedging occurs either against weather-based metrics, such as the amount of rainfall, or against the price of electricity itself. This paper discusses the statistical character of weather metrics and electricity prices and demonstrates how it is possible, but more importantly, highly appropriate to assess the degree of risk for both using fractal concepts. From this, fair values for hedging contracts can be obtained.

Authors

Michael Lauren

Speaker

Michael Lauren

Defence Technology Agency

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

LeylandG

Title

Stochastic Optimization: Rowing to Barbados

Abstract

On the 19th of October 2003, 16 rowing boats set out from La Gomera in the Canary Islands, and headed for Port Charles in Barbados, some two and a half thousand nautical miles away. 40 days, 5 hours and 31 minutes later, Team Holiday Shoppe---Kevin Biggar and Jamie Fitzgerald---crossed the finish line, winning the race and setting a new world record. A key ingredient in the team's success was a set of routing charts developed by the authors using stochastic optimization techniques. These charts not only helped the Team with their strategic and day-to-day route planning, they also proved vital in the Team's successful defense against the second-place finisher's official protest, which complained that Team Holiday Shoppe was too fast. This paper tells the story of Team Holiday Shoppe's journey, and the role that stochastic optimization played in the Team's success.

Authors

Geoff Leyland, Andy Philpott

Speaker

Geoff Leyland

Stochastic Optimization Limited

Full Paper

Abstract-only submission.

YPP Entry?

No

 

PaperID

MadsenO

Title

The Vehicle Routing Problem with Time Windows - Survey and Recent Developments

Abstract

The use of Operations Research in forest decision making has been very successful for several decades. From strategic decisions to detailed short term operations in harvesting and transportation,  applications have been used in many countries. Recently spatial problems derived from environmental considerations have attracted attention, leading to hard combinatorial problems. Other areas of problems come from uncertainty considerations, in particular prices and consideration of fires and other disturbances. The integration of the forest supply chain is also becoming a topic of interest. In general, the methodologies used concentrate on Linear Programming (LP), Mixed Integer LP, Heuristics, Metaheuristics and Simulation. In this talk we discuss what has been accomplished in these areas, but mostly focus on open problems which have not been solved yet. The challenges are in how to model reality realistically and in solution algorithms to solve difficult problems.

Authors

Oli Madsen

Speaker

Oli Madsen

CTT-  Centre for Traffic and Transport, Technical University of Denmark

Full Paper

Abstract-only submission.

YPP Entry?

No

 

PaperID

MasonA

Title

Elastic Constraint Branching, the Wedelin Lagrangian Heuristic and Staff Scheduling

Abstract

The Wedelin algorithm is a Lagrangian based heuristic that is being successfully used by Carmen Systems to solve large crew pairing problems within the airline industry. We extend the Wedelin approach by developing an implementation for personnel scheduling problems (also termed staff rostering problems) that exploits the special structure of these problems. We also introduce elastic constraint branching with the twin aims of improving the performance of our new approach and making it more column generation friendly. Numerical results show that our approach can outperform the commercial solver Cplex on difficult commercial rostering problems.

Authors

Andrew Mason

Speaker

Andrew Mason

University of Auckland

Full Paper

Abstract-only submission.

YPP Entry?

No

 

PaperID

McNaughtonA

Title

Recent progress on the area restriction problem of forest harvesting

Abstract

It is required to optimize timber output subject to sustainability requirements along with a specified maximum clearfell area. A report will be given outlining some significant recent advances.

Authors

Alastair McNaughton

Speaker

Alastair McNaughton

University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

MitchellS

Title

Yield Frontier Analysis of Forest Inventory

Abstract

Standing inventory analysis allows a forest manager to predict the yield (volume by log-type) of a section of forest before the trees have been felled. Current standing inventory tools such as MARVL and ATLAS Cruiser predict yield by simulating the bucking that will take place when the trees are harvested. These tools produce a point estimate of the yield for given input data.

 

In practice, when the trees are bucked at harvest, the harvesting crew can make different decisions than those predicted by the analysis. This paper presents a method of standing inventory analysis that seeks to characterise all possible yields for a section of forest.

Authors

Stuart Mitchell and K. Kim

Speaker

Stuart Mitchell

Dept of Engineering Science, University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

OSullivanM

Title

Connecting Efficient Knapsacks: Experiments with the Equally-Weighted Bi-Criteria Knapsack Problem

Abstract

There are many applications of the classical knapsack problem in which the weights of the items being considered for the knapsack are identical, e.g., selecting successful applicants for grants, awarding scholarships to students, etc. Often there are multiple criteria for selecting items to be placed in the knapsack. This paper presents two new algorithms for finding efficient bi-criteria knapsacks when selecting from equally-weighted items. The success of these two algorithms depends on unproven characteristics of the set of efficient knapsacks. If there is a path of adjacent swaps between any two efficient knapsacks that remains within the set of efficient knapsacks, then the first algorithm will find the complete set of efficient knapsacks. The second algorithm requires an even stronger condition to guarantee finding the complete set: the path of adjacent swaps must not only remain within the set of efficient knapsacks, but each swap must increase one objective while decreasing the other. We compare the two algorithms with a bi-criteria shortest-path approach (that is known to find the set of efficient knapsacks) and show that the two algorithms produce the same solutions for a large test set of randomised problems. Our results suggest an interesting structure for the set of efficient knapsacks that may be exploited to find faster solution methods.
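The notion of efficient equally-weighted knapsacks connected by adjacent swaps can be made concrete with a brute-force sketch (an editor's illustration on an invented instance, not the paper's algorithms, which avoid full enumeration):

```python
from itertools import combinations

def efficient_knapsacks(values, p):
    """Brute-force the efficient (nondominated) knapsacks of p
    equally-weighted items under two value criteria.  `values` is a list
    of (v1, v2) pairs; returns the efficient subsets of item indices."""
    cands = []
    for subset in combinations(range(len(values)), p):
        v1 = sum(values[i][0] for i in subset)
        v2 = sum(values[i][1] for i in subset)
        cands.append((frozenset(subset), (v1, v2)))

    def dominates(a, b):
        # a dominates b: at least as good in both criteria, not equal.
        return a[0] >= b[0] and a[1] >= b[1] and a != b

    return {s for s, obj in cands
            if not any(dominates(o, obj) for _, o in cands)}

values = [(3, 1), (1, 3), (2, 2), (0, 0)]
eff = efficient_knapsacks(values, 2)
# On this instance every pair of efficient knapsacks differs by a single
# item swap, consistent with the adjacency structure the paper studies.
```

The paper's conjecture is exactly about whether such single-swap paths through the efficient set always exist; the brute force above is only a way to inspect small instances.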

Authors

Cameron Walker and Michael O’Sullivan

Speaker

Michael O'Sullivan

University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

PatelS

Title

Locomotive Allocation for Toll NZ

Abstract

A locomotive is defined as a self-propelled vehicle for pulling freight or passenger cars along railroad tracks, and is considered separate from the train itself, which comprises carriages or wagons. Locomotives are required to pull trains to their destination. The assignment of locomotives to trains is known as the 'Locomotive Plan', which ensures that every train has the correct number of locomotives allocated. However, unforeseen events such as delays or breakdowns occur frequently and disrupt the Locomotive Plan. The objective of this project was to develop an optimization model for Toll New Zealand (NZ) that re-allocates locomotives to trains in response to these daily unforeseen events. The model makes alterations to the plan, ensuring that the correct number of locomotives is assigned to each train and that the alterations are optimal according to preferences indicated by Toll NZ. The optimization model and solution process involved 'a priori' column generation and zero-one integer programming.

 

The model successfully re-allocated the locomotives to the trains for a range of possible disruptions at varying times during the week on the Electric Route (which covers the Central North Island). The results indicated that the model produces a better re-allocation, in terms of the preferred changes, than what is done manually by a Locomotive Controller at Toll NZ.

Authors

Sanjay Patel

Speaker

Sanjay Patel

Engineering Science, University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

Yes

 

PaperID

PaynterJ

Title

Academic Plagiarism: An Analysis of Current Technological Issues

Abstract

Current research indicates that academic plagiarism is a serious problem for tertiary institutions worldwide. Students will resort to many methods of plagiarism in order to escape the 'hard work' required to complete assignments legitimately. The all-too-common excuse is something along the lines of "I didn't have the time". Results from an online survey targeted at New Zealand tertiary students indicate that approximately 20% of students have plagiarised to some extent in their academic careers, and 69% admit to knowing of others who have plagiarised.

 

Turnitin.com is the world's most popular Electronic Plagiarism Detector (EPD), being used in over 51 countries. Its Document Source Analysis methodology provides an in-depth analysis of a particular document, cross-referencing it with billions of Internet resources as well as previously submitted documents. It is a secure service that has drawn mainly positive attitudes from survey respondents.

 

Turnitin.com is a good method of detecting plagiarism, but cannot be expected to fight the prevention battle alone. It is recommended that tertiary institutions educate students about how their work will be assessed and the potential penalties imposed if submitted work is found to be plagiarised.

Authors

Conor J. Mills and John Paynter

Speaker

John Paynter

University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

PhilpottA

Title

Single-Unit Commitment Problems in Electricity Pool Markets

Abstract

We consider an electricity generator with a single generating unit making offers of energy into an electricity pool market. For a given time period, it must submit to the pool system operator a supply function, typically consisting of a fixed number of quantities of energy and the prices at which it wants these dispatched. The amount of dispatch depends on the supply function offered along with the offers of the other generators and market demand, both of which are random but are assumed to have known distribution functions. The generator seeks an offer stack that maximizes its expected profit, accounting for start-up and shut-down costs and/or ramp rates. We describe an optimization procedure based on dynamic programming that can be used to construct optimal offer stacks in successive time periods over a fixed planning horizon (typically a single trading day).

Authors

Andy Philpott

Speaker

Andy Philpott

University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

PriestleyJ

Title

Solution of the Airline ToD Problem using Severely Limited Subsequence

Abstract

The minimum-cost Tour-of-Duty (ToD) Problem is extremely well known in the airline industry. ToDs are typically constructed by considering the complete set of subsequent flights (subsequences) that may be operated by a crewmember after they operate a given flight. The column dimension of the resulting model is generally extremely large, so column generation is used.

Limiting the set of subsequences that may be considered is beneficial in terms of reducing the number of columns and improving the integer properties of the LP relaxation. The downside to this approach is that solution quality can be compromised if the limited subsequence set is not carefully selected.

We investigate how to dynamically alter the limited subsequence set throughout the solution process so as not to compromise solution quality, while still maintaining the benefits of limiting subsequence. The use of dual stabilisation and constraint aggregation techniques in this context will be discussed. We present comparative results from real airline problems to demonstrate the efficacy of this technique.

Authors

James Priestley

Speaker

James Priestley

Auckland University

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

PritchardG

Title

HERO (Hydro-electric reservoir optimization)

Abstract

For the operator of a hydro-electric reservoir in a pool electricity market, the optimal stack to offer in each trading period over a planning horizon can be computed using dynamic programming. However, the market trading period (usually 1 hour or less) may be much shorter than the inherent time scale of the reservoir (often many months). We devise a model to handle these two inherently different timescales. In this model, the decision made at the beginning of each stage consists of a target mean and variance of the water release in the coming stage. This decomposes the problem into inter-stage and intra-stage subproblems.

Authors

G. Pritchard, G. Zakeri, A. Philpott

Speaker

Geoffrey Pritchard

University of Auckland

Full Paper

Abstract-only submission.

YPP Entry?

No

 

PaperID

RaffenspergerJF

Title

Application of column splitting to the travelling salesman problem.

Abstract

We show how to improve the Held-Karp lower bound on the Travelling Salesman Problem, by applying column splitting. The subproblems are the 1-tree and the 2-matching problems. We show how to find the new bound with both subgradient optimisation and column generation. We prove that the lower bounds on these models are at least as good as the standard Held-Karp lower bound, and we give examples with strictly better bounds.
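For orientation, the basic 1-tree relaxation underlying the Held-Karp bound can be sketched as follows (a minimal editor's illustration without the subgradient node penalties or the paper's column-splitting refinement; the unit-square instance is invented):

```python
import math

def one_tree_bound(dist):
    """Lower bound for the symmetric TSP via a minimum 1-tree: a minimum
    spanning tree on nodes 1..n-1 (Prim's algorithm) plus the two
    cheapest edges incident to node 0.  Every tour is a 1-tree, so the
    minimum 1-tree weight bounds the optimal tour length from below."""
    n = len(dist)
    best = {v: dist[1][v] for v in range(2, n)}   # cheapest link into tree
    mst = 0.0
    for _ in range(n - 2):                        # grow MST from node 1
        v = min(best, key=best.get)
        mst += best.pop(v)
        for u in best:
            best[u] = min(best[u], dist[v][u])
    two_cheapest = sorted(dist[0][v] for v in range(1, n))[:2]
    return mst + sum(two_cheapest)

# Unit square: the optimal tour has length 4, and the bound attains it here.
s2 = math.sqrt(2)
dist = [[0, 1, s2, 1],
        [1, 0, 1, s2],
        [s2, 1, 0, 1],
        [1, s2, 1, 0]]
bound = one_tree_bound(dist)
```

Held and Karp tighten this by penalising node degrees via subgradient optimisation; the paper's contribution is to improve the resulting bound further by splitting columns between the 1-tree and 2-matching subproblems.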

Authors

John F. Raffensperger

Speaker

John Raffensperger

University of Canterbury

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

ReadE

Title

A Theoretical Framework for Zonal Electricity Markets

Abstract

Debate has raged, in various parts of the world, about the relative merits of 'nodal' vs 'zonal' electricity markets.  Proponents of each proclaim that their favoured model is simpler, more natural, and more effective than the other.  In reality, no one market model ideally fits the widely varying circumstances of markets around the world, and some kind of compromise, or hybrid may often be the best solution.  But the task of comparing, let alone integrating, these market concepts has been hampered by the fact that, while an elegant theory has been developed for nodal markets, current zonal market  designs are essentially ad hoc constructions, whose theoretical structure is not necessarily well understood.  Here we report on a theoretical framework developed in the course of a major review of the regional structure of the Australian national electricity market.  We show that, with care, a Linear Programming 'market-clearing' formulation can be developed for a zonal market which gives identical dispatch outcomes to those for a nodal market.  Further, the shadow prices corresponding to the optimal market-clearing solution can be interpreted to provide pricing signals equivalent to those from a nodal market, while contracting instruments can be devised which are essentially equivalent to the 'Financial Transmission Rights' often employed in nodal markets.  This means that a consistently designed and implemented zonal market can yield equivalent outcomes to a nodal market, and that much of the theory developed for nodal markets can also be applied in this context.  But we also show that the framework developed here generalises the nodal market framework in ways which can enhance the design and performance of nodal electricity markets, particularly in situations where constraints must be placed on variables other than "nett nodal injections".

Authors

E. Grant Read, Deb Chattopadhyay

Speaker

E Grant Read

Canterbury University

Full Paper

Abstract-only submission. You can view abstract [accepted]

YPP Entry?

No

 

PaperID

SheffieldJ1_

Title

Cooking the Evidence - A Recipe for Discursive Truth

Abstract

How is operations research embedded in broader organizational and social contexts? How does OR knowledge relate to personal freedom, social justice and objective truth? What’s the bottom line for institutions involved in technology-assisted strategic planning?

This paper proposes that a new notion of discursive truth christened Habermasian Inquiring Systems may assist critical operations researchers in facilitating practical knowledge of multi-party, multi-issue situations such as these.

The concept of a Habermasian Inquiring System (HIS) is developed from the work by Churchman, Habermas and Ulrich. HIS combine elements of general systems concepts, Churchman’s inquiring systems, validity claims based on Habermas theory of communicative action, and Ulrich’s critical systems thinking.

The purpose of a Habermasian Inquiring System is to develop and test the coherence among intentions and outcomes via the gold standard of Habermasian ideal speech. The latter is based on social actors’ validity claims for technical, social and personal knowledge. Technical knowledge is validated by objective truth. Organisational and social knowledge is validated by rightness. Personal knowledge is validated by truthfulness and sincerity.

This paper also investigates the degree to which Habermasian Inquiring Systems generate mid-range theories useful in the design and evaluation of critical operations research interventions that involve conflicting values. A particular HIS known as the Validity Claims or VC-Model is developed for application within strategic planning. A case example of the application of the VC-Model is reported in the companion paper "Eating your own cooking: Evaluating discursive truth".

It is concluded that Habermasian Inquiring Systems provide researchers with an accessible theory of knowledge of practical value in integrating the multiple perspectives required in critical operations research.

Authors

Jim Sheffield

Speaker

Jim Sheffield

University of Auckland

Full Paper

Abstract-only submission. You can view abstract [accepted]

YPP Entry?

No

 

PaperID

SheffieldJ2_

Title

Eating Your Own Cooking - Evaluating Discursive Truth

Abstract

How is operations research embedded in broader organizational and social contexts? How does OR knowledge relate to personal freedom, social justice, and objective truth? What is the bottom line for institutions involved in technology-assisted strategic planning?

This paper proposes that a new notion of discursive truth, christened Habermasian Inquiring Systems, may assist critical operations researchers in facilitating practical knowledge of multi-party, multi-issue situations such as these.

This paper demonstrates the application of a particular Habermasian Inquiring System, the VC or ‘Validity Claims’ Model (Sheffield 2004, see also the companion paper ‘Cooking the evidence: A recipe for discursive truth’), to evaluate value conflict in strategic planning.

The technology-enabled intervention is intended to facilitate the strategic evaluation of a comprehensive urban plan. The stakeholders are planners representing territorial authorities and the Auckland Regional Council. The study focuses on their participation in a facilitated, electronically-supported meeting.

The action research intervention is envisioned as electronically supported discourses, designed and evaluated against a gold standard of ideal speech in a perfect communication environment. These discourses are examined for coherence, or strategic fit, among strategic planning intentions and outcomes in each of Habermas’s knowledge worlds - personal, social and technical.

Support is found for the propositions that Habermasian Inquiring Systems (HIS) assist stakeholders to interpret (a) anomalous preferences and (b) a critical incident. It is suggested that Habermasian Inquiring Systems such as the VC (Validity Claims) Model provide mid-range theories widely useful in areas requiring a critical appreciation of value conflict. Application areas include collaborative learning, research methods, strategic planning, policy development, and organizational change.

Authors

Jim Sheffield

Speaker

Jim Sheffield

University of Auckland

Full Paper

Abstract-only submission. You can view abstract [accepted]

YPP Entry?

No

 

PaperID

SinghK

Title

Column Generation for Capacity-Expansion Planning of Electricity Distribution Networks

Abstract

We present a stochastic model for capacity-expansion planning of electricity distribution networks subject to uncertain demand (CEP).

 

We formulate CEP as a multi-stage stochastic mixed-integer program with scenario-tree representations of uncertainty.  At each node of the scenario-tree, the model involves decisions for capacity-expansion, operating configuration, and power flows.

 

We present a 'super-arc' representation of the network that significantly reduces the number of binary variables and provides a tighter linear-programming relaxation. Dantzig-Wolfe decomposition leads to (a) a master problem containing binary capacity-expansion and high-level operating decisions; and (b) column-generating subproblems which are mixed-integer programs representing single-period, deterministic capacity-expansion models. Solution times for column generation applied to these models are significantly better than those for CPLEX applied to the original stochastic integer programs.

Authors

Kavinesh Singh

Speaker

Kavinesh Singh

Dept of Engineering Science, University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

Yes

 

PaperID

TamB

Title

Multi-Criteria Methods for Unit Crewing in the Airline Tour-of-Duty Planning Problem

Abstract

In crew scheduling, the aim of unit crewing is to keep crew members from different crew ranks together to operate a sequence of flights as a unit as much as possible. When the crew members of different ranks operate a sequence of flights together, the sequence is unit crewed. A unit crewed solution is considered more robust in the sense that aircraft departures are less likely to be delayed by waiting for a member of a crew. It is usual to solve the Tour-of-Duty (ToD) planning problem for each crew rank as a separate problem. To maximize unit crewing in the ToD solution, a minimal-cost set of ToDs for one rank is constructed and a weighted sum method is used to penalize any tour for the other rank that does not contain the same flight sequences. We believe that a model that solves the two ToD planning problems simultaneously can improve the number of unit crewed sequences without increasing the total crew cost. We use the number of unit crewed flights as a second objective and a branch and bound strategy to ensure that as many unit crewed sequences as possible are selected.

Authors

Bassy Tam

Speaker

Bassy Tam

Engineering Science, University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

ThompsonT

Title

Optimal Core-Edge Storage Area Network Design

Abstract

Storage Area Networks (SANs) have become a popular solution for information management as the integrity and reliability of data storage have become increasingly important within industry. A SAN manages the transfer of data from sets of servers and/or clients to centralized data storage through an internal fabric comprising switches, hubs and links. Core-Edge is a reference topology widely used in the layout of SAN fabrics. We study the problem of designing this SAN fabric for specific demands, utilizing the Core-Edge topology. We present an integer programming (IP) model for this restricted network design problem and offer preliminary results with comparisons to commercial standards for fabric design.

Authors

Timothy Thompson

Speaker

Timothy Thompson

Engineering Science, University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

Yes

 

PaperID

TrietschD

Title

Stochastic Economic Balance Principles for Project Planning: Feeding Buffers, Crashing, and Sequencing

Abstract

A feeding buffer is created by starting a project activity before its latest start time to provide adequate protection against project delay. Assuming linear costs for starting activities earlier and a linear project delay penalty, early optimization models for project buffers addressed simple project network structures and invoked the newsvendor model as an approximation. Recently, it has been shown that when the gating activities precede the only stochastic elements in a project, then there exists an exact generalization of the newsvendor model that characterizes the optimal feeding buffers. This insight led to an effective and efficient solution approach by simulation. We show that this result also holds when stochastic elements exist elsewhere within the project. Therefore, the same simulation approach applies. This yields practically optimal feeding buffers even when it is impossible to compute the completion time distribution. The same insights can be used for sequencing decisions and, with a slight enhancement, for optimal crashing decisions. The method works even when activities are statistically dependent.
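The newsvendor connection above can be illustrated with a minimal simulation sketch. This is a hypothetical illustration, not the paper's procedure: with a linear delay penalty and a linear earliness cost, the optimal buffer is (approximately) the empirical quantile of the simulated delay distribution at the critical fractile.

```python
import random

def optimal_feeding_buffer(sample_delay, delay_penalty, earliness_cost, n=10000, seed=1):
    """Newsvendor-style buffer sizing by simulation: with linear costs, the
    optimal buffer is the quantile of the (simulated) delay distribution at
    the critical fractile delay_penalty / (delay_penalty + earliness_cost)."""
    rng = random.Random(seed)
    delays = sorted(sample_delay(rng) for _ in range(n))
    fractile = delay_penalty / (delay_penalty + earliness_cost)
    return delays[min(n - 1, int(fractile * n))]

# Hypothetical feeding chain: its delay is the sum of two uniform activity
# durations; a unit of delay costs four times a unit of early start.
buffer = optimal_feeding_buffer(
    lambda rng: rng.uniform(0, 5) + rng.uniform(0, 3),
    delay_penalty=4.0, earliness_cost=1.0)
```

Note that the sampler is a black box here, which mirrors the abstract's point: the buffer can be sized by simulation even when the completion time distribution cannot be computed analytically.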

Authors

Dan Trietsch

Speaker

Dan Trietsch

University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

VelasquezR

Title

Solving a large-scale dynamic facility location problem with variable neighbourhood and token ring search

Abstract

Dynamic facility location is a problem within Supply Chain Management that many large organizations face. Companies need to know which plants and distribution centers should operate, which products should be produced in which plant, and when products should be stored in warehouses so that clients receive them on time. In this presentation, a multiple neighborhood search approach is applied to solve a large-scale dynamic facility location mixed integer model. The model's aspects are: a dynamic planning horizon, a generic supply chain network structure, external supply and inventory opportunities for products as well as their distribution, storage limitations, facility configuration, and the availability of capital for investment in the relocation of facilities. To solve this problem, the heuristic constructs an initial feasible solution and then improves it through a multi-neighborhood search. Finding a feasible solution involves an iterative integer rounding of binary variables, as well as exploiting the problem's characteristics and a randomized algorithm. The multi-neighborhood search considers four different neighborhoods and three algorithms. These techniques are used to solve randomly generated problems with up to three echelons in the supply chain network, holding on average around 3,000 constraints, 260,000 continuous variables and 120 binary variables. The heuristic ran on a Pentium III, 2.6 GHz processor for a little under three minutes on average and returned a value within a little under 0.5% of the optimal solution.
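As a generic illustration of the multi-neighborhood idea (a toy sketch under assumed move operators, not the paper's four neighborhoods or three algorithms), a basic variable neighborhood search cycles through moves and returns to the first neighborhood whenever one improves the incumbent:

```python
import random

def vns(objective, x0, neighbourhoods, outer_iters=200, seed=0):
    """Minimal variable neighbourhood search: cycle through neighbourhood
    moves, restarting the cycle whenever a move improves the incumbent."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    for _ in range(outer_iters):
        k = 0
        while k < len(neighbourhoods):
            y = neighbourhoods[k](x, rng)
            fy = objective(y)
            if fy < fx:
                x, fx, k = y, fy, 0   # improvement: restart neighbourhood cycle
            else:
                k += 1                # no improvement: try a larger neighbourhood
    return x, fx

# Toy usage: minimise a 1-D function with a small-step and a large-step move.
objective = lambda x: (x - 3.0) ** 2
moves = [lambda x, r: x + r.uniform(-0.1, 0.1), lambda x, r: x + r.uniform(-2.0, 2.0)]
x_best, f_best = vns(objective, 0.0, moves)
```

The design point the abstract relies on is visible even in this sketch: small neighborhoods refine the incumbent cheaply, while larger ones help escape regions where the small moves stall.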

Authors

R. Velasquez and M. T. Melo

Speaker

Rafael Velasquez

Technical University of Kaiserslautern, Germany

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

WatererH

Title

Solving the Classroom Assignment Problem Using Integer Programming

Abstract

In the classroom assignment problem, classes, which meet at different time intervals, must be assigned rooms. Each class must be assigned a suitable room and no two classes can meet simultaneously in the same room. This classic problem is in no way restricted to teaching institutions and is typically solved manually or by commercially available software packages employing heuristics. We introduce a novel integer programming model of the problem and present computational results of our experiences in using this model to solve this problem at The University of Auckland.
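The feasibility requirement can be stated as: each class receives exactly one suitable room, and classes with overlapping time intervals never share a room. As an illustration only (the paper's actual IP model is not reproduced here), a small backtracking search over a hypothetical instance plays the role of the IP's binary assignment variables:

```python
# Hypothetical toy instance: each class has a meeting interval and a set of
# suitable rooms; overlapping classes may not share a room.
classes = {
    "OR101": {"time": (9, 11), "rooms": {"A", "B"}},
    "OR202": {"time": (10, 12), "rooms": {"A"}},
    "OR303": {"time": (11, 13), "rooms": {"A", "B"}},
}

def overlaps(s, t):
    # Half-open intervals: (9, 11) and (11, 13) do not overlap.
    return s[0] < t[1] and t[0] < s[1]

def assign(names, assignment=None):
    """Backtracking stand-in for the IP: choosing assignment[c] == r plays
    the role of setting the binary variable x[c, r] = 1."""
    assignment = assignment if assignment is not None else {}
    if len(assignment) == len(names):
        return dict(assignment)
    c = names[len(assignment)]
    for r in sorted(classes[c]["rooms"]):
        if all(assignment[d] != r or not overlaps(classes[c]["time"], classes[d]["time"])
               for d in assignment):
            assignment[c] = r
            solution = assign(names, assignment)
            if solution is not None:
                return solution
            del assignment[c]
    return None

plan = assign(list(classes))
```

In this instance OR202 forces room A, which pushes both overlapping neighbours to room B; an IP solver resolves exactly this kind of propagation, but at the scale of a full university timetable.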

Authors

H. Waterer and D.M. Ryan

Speaker

Hamish Waterer

Engineering Science, University of Auckland

Full Paper

Abstract-only submission.

YPP Entry?

No

 

PaperID

WeintraubA

Title

Open Challenges In Forest Modeling And Algorithms: Applications And Methodology

Abstract

The use of Operations Research in forest decision making has been very successful for several decades. From strategic decisions to detailed short-term operations in harvesting and transportation, applications have been used in many countries. Recently, spatial problems derived from environmental considerations have attracted attention, leading to hard combinatorial problems. Other problem areas arise from considerations of uncertainty, in particular in prices and in fires and other disturbances. The integration of the forest supply chain is also becoming a topic of interest. In general, the methodologies used concentrate on Linear Programming (LP), Mixed Integer LP, heuristics, metaheuristics and simulation. In this talk we discuss what has been accomplished in these areas, but mostly focus on open problems which have not yet been solved. These challenges appear both in modelling reality realistically and in solution algorithms for difficult problems.

Authors

Andres Weintraub

Speaker

Andres Weintraub

University of Chile

Full Paper

Abstract-only submission.

YPP Entry?

No

 

PaperID

WinzI

Title

A decision support system for radiation therapy treatment planning

Abstract

This paper reports on the use of an interactive decision support system to generate treatment plans for external beam radiation therapy. When evaluating treatment plans, radiation therapists are often faced with situations where overdose of healthy organs or underdose of the tumour cannot be avoided. Current treatment planning software incorporates trial-and-error methods to change treatment parameters and re-optimise in order to generate a better treatment plan. However, this process is not only inefficient, but also does not yield information on how radiation doses depend on the structure site and the planning parameters. Here a multi-criteria-based optimisation model is presented, which is used to calculate a large number of efficient treatment plans. These are stored in a database and accessed for evaluation by the decision support system CARINA. The navigation among those solutions and the information that is provided to guide the user in this process are described. Of special importance is sensitivity analysis, which extracts dose dependence information for the tumour and healthy organs from the efficient treatment plans. As a result, plan quality is improved by finding advantageous trade-offs in competing treatment plans. Additionally, the common trial-and-error process is avoided and effectiveness in treatment planning is increased.

Authors

Ines Winz

Speaker

Ines Winz

Department of Engineering Science, UoA

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

WoodK

Title

The BEST Algorithm for Stochastic Mixed-Integer Programs:  Bootstrapping to Make It Better

Abstract

We present the 'BEST algorithm' for solving two-stage stochastic programs with discrete first-stage variables and with continuous and/or discrete second-stage variables (SMIPs).  In its simplest form, for a minimizing SMIP, the algorithm incorporates these steps:

(1) Bound: Compute a deterministic upper bound on the optimal objective value, and devise a deterministic lower-bounding function;

(2) Enumerate: Enumerate all (first-stage) solutions whose lower bounds do not exceed the upper bound and thus may be optimal;

(3) Sample:  For each potentially optimal first-stage solution, use a Monte-Carlo procedure to sample outcomes (i.e., sample second-stage random variables and solve the resulting second-stage optimization problems); and

(4) Test:  Test the samples for quality and choose the solution or solutions that are 'likely' to be optimal.

 

Testing can involve any number of 'classical' subset-selection and screening methods that provide statistical assurances about solution quality.  However, we have recently found bootstrapping to be a simpler and more powerful approach.  We describe this approach and provide computational results for a stochastic facility-location problem.
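On a toy problem, the four steps might look as follows. This is a hypothetical sketch with invented costs, and step 4 naively picks the best sample mean, where the paper uses statistical screening or bootstrapping to obtain assurances about solution quality:

```python
import random
import statistics

rng = random.Random(0)

# Toy SMIP (hypothetical): choose one facility x; total cost is a fixed
# first-stage cost plus a random second-stage recourse cost.
fixed = {"A": 10.0, "B": 12.0, "C": 20.0}
spread = {"A": 8.0, "B": 3.0, "C": 1.0}   # recourse is uniform on [0, spread[x]]

def sample_recourse(x, rng):
    return rng.uniform(0.0, spread[x])

# (1) Bound: a deterministic upper bound (best worst-case total cost) and a
#     lower-bounding function (fixed cost plus zero recourse).
upper = min(fixed[x] + spread[x] for x in fixed)
# (2) Enumerate: keep first-stage solutions whose lower bound is within the bound.
candidates = [x for x in fixed if fixed[x] + 0.0 <= upper]
# (3) Sample: Monte Carlo estimates of the expected cost of each candidate.
means = {x: statistics.fmean(fixed[x] + sample_recourse(x, rng) for _ in range(2000))
         for x in candidates}
# (4) Test: naively pick the best sample mean (the paper's screening and
#     bootstrapping methods add statistical assurances this step lacks).
best = min(means, key=means.get)
```

Even here the bounding step pays off: the expensive candidate is pruned before any sampling, so the Monte Carlo effort is spent only on solutions that could plausibly be optimal.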

Authors

Susan M. Sanchez, R. Kevin Wood

Speaker

Kevin Wood

Naval Postgraduate School

Full Paper

Abstract-only submission.

YPP Entry?

No

 

PaperID

WoodL

Title

The Impact of JIT Replenishment and Road Traffic Congestion on Distribution Costs

Abstract

The Auckland region in New Zealand (NZ) accounts for a third of NZ’s population but only about 2% of NZ’s landmass. The relative dominance of the Auckland region is a rare instance of polarized population distribution that is paralleled only by a handful of other urban regions such as Athens in Greece and Buenos Aires in Argentina. (Athens has roughly 30% of Greece’s population over a landmass of 0.25%, while Gran (Greater) Buenos Aires has a third of Argentina’s population over a landmass of 0.13%.) The dominance of Auckland in New Zealand is expected only to increase over time as it experiences over half the annual growth in NZ’s population. As a consequence, road traffic congestion in the Auckland region is considered a national problem: in terms of lost income, time, and pollution, congestion is estimated to cost the NZ economy nearly NZ$1 billion per year - about 1% of NZ’s GDP. (This is an extrapolation from a research report prepared in 1997 by the consultancy Ernst and Young, which at the time estimated the cost to be NZ$775 million.)

 

In exploratory case study research conducted by one of the authors and his collaborators, a theme that emerged was the compounding effect of shrinking consignment sizes on distribution costs.  Thus, even as congestion levels are rising, customers are replenishing product on a Just-In-Time basis which further drives up distribution costs.  Since the isolation of the impact of congestion on logistics costs is not always easy, organisations are tempted to impute rising logistics costs to congestion more so than is justifiable.

 

In this paper, we describe the use of continuous approximation models that are adapted from Daganzo and other researchers to capture the relative impact of congestion on distribution costs vis-à-vis shrinking consignment sizes.  We report the results on real-world data drawn from a NZ manufacturer and distributor.

Authors

Lincoln Wood and Jay Sankaran

Speaker

Lincoln Wood

University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

No

 

PaperID

YoungM

Title

Improved Column Generation for the Air New Zealand Tours-of-Duty Problem

Abstract

This research proposes a new column generation approach to be used within the solution process for the minimum-cost Tours-of-Duty (ToD) problem for Air New Zealand. The objective of ToD problems is to find the minimum cost combination of ToDs such that each flight in the flight schedule is crewed. Contractual rules and regulations governing the construction of ToDs make the column generation problem computationally very complex. The new approach aims to improve the efficiency of generating new ToDs by reducing the number of computationally expensive rule-checking calls needed to construct legal ToDs. At present, a constrained shortest path approach is implemented to solve the column generation problem, in which each rule is checked at each step of the construction of a ToD. We propose an approach to column generation in which infeasibilities during construction can be detected by comparing the current partial ToD with infeasible partial ToDs already enumerated. Preprocessing techniques are used to detect whether partial ToDs can be completed feasibly, and selected rules are checked only after constructing a ToD.
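The pruning idea can be sketched as follows. This is a hypothetical illustration with invented sector names and a dummy rule, not Air New Zealand's rule set: a cache of already-enumerated infeasible partial ToDs is consulted before any expensive rule check is made.

```python
# Hypothetical sketch: partial ToDs are compared against a cache of
# already-enumerated infeasible partial ToDs, so that computationally
# expensive rule checks can be skipped during construction.
infeasible_partials = {("AKL-WLG", "WLG-CHC", "CHC-AKL")}  # assumed prior enumeration

def extend(partial, flight, rules_ok):
    """Append `flight` to a partial ToD, pruning via the infeasibility cache
    before falling back to the expensive rule checks."""
    candidate = partial + (flight,)
    if candidate in infeasible_partials:
        return None                      # pruned with no rule-checking call
    if not rules_ok(candidate):
        infeasible_partials.add(candidate)  # remember the failure for next time
        return None
    return candidate

# Usage with a dummy rule capping a ToD at three sectors: the first call is
# pruned by the cache; the second passes the rule check.
rule = lambda tod: len(tod) <= 3
pruned = extend(("AKL-WLG", "WLG-CHC"), "CHC-AKL", rule)
extended = extend(("AKL-WLG",), "WLG-AKL", rule)
```

Because each rule failure is recorded, the cache grows as enumeration proceeds and the same infeasible prefix is never rule-checked twice.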

Authors

Martin Young

Speaker

Martin Young

Engineering Science, University of Auckland

Full Paper

You can view paper [accepted]

YPP Entry?

Yes

 

PaperID

ZakeriG

Title

Matlab splines utilized from GAMS

Abstract

Many applications in industries such as car manufacturing or petroleum entail the optimization of black box functions. Typically these functions are very expensive to evaluate, and it is desirable to approximate them using very few evaluations. A useful tool for such an approximation is a spline. In this paper we describe how black box functions approximated by Matlab splines can be optimized subject to constraints with the aid of the GAMS software.
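The surrogate idea can be sketched in a few lines. This is a hypothetical analogue only, with a crude piecewise-linear interpolant standing in for a Matlab spline and a grid search standing in for constrained optimization in GAMS:

```python
import math
from bisect import bisect_right

def surrogate_minimize(black_box, lo, hi, n_samples=7, n_grid=1000):
    """Sample the expensive function at a few points, build a piecewise-linear
    surrogate (a crude stand-in for a spline), and minimise the surrogate on a
    fine grid; only n_samples calls ever reach the black box."""
    xs = [lo + i * (hi - lo) / (n_samples - 1) for i in range(n_samples)]
    ys = [black_box(x) for x in xs]          # the only expensive evaluations

    def surrogate(x):
        # Locate the segment containing x and interpolate linearly.
        i = min(max(bisect_right(xs, x) - 1, 0), n_samples - 2)
        t = (x - xs[i]) / (xs[i + 1] - xs[i])
        return ys[i] + t * (ys[i + 1] - ys[i])

    grid = [lo + i * (hi - lo) / n_grid for i in range(n_grid + 1)]
    return min(grid, key=surrogate)

# Usage: locate the basin of an 'expensive' function on [0, 4] from just
# seven true evaluations.
x_star = surrogate_minimize(lambda x: (x - 2.0) ** 2 + 0.1 * math.sin(5 * x), 0.0, 4.0)
```

A spline surrogate improves on this sketch in the same way the paper's approach does: it is smooth, so derivative-based solvers such as those available through GAMS can be applied to it.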

Authors

Michael Ferris and Golbon Zakeri

Speaker

Golbon Zakeri

University of Auckland

Full Paper

Abstract-only submission.

YPP Entry?

No