

TEST EQUIPMENT AND CALIBRATION

4.2 EXPERIMENTAL APPARATUS


volumetric strains. As confirmation, a conventional method of specimen volume change measurement was also used. This consisted of a 100 mm diameter graduated burette connected to the specimen top drainage lines. The burette had a capacity of 4000 ml, and readings could be made to an accuracy of 7.85 ml. Measurements taken using both methods were compared for cross-checking.
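The quoted reading accuracy follows directly from the burette geometry. A minimal check, assuming the 7.85 ml figure corresponds to a 1 mm graduation interval (an interpretation; the text does not state this explicitly):

```python
import math

# Burette: 100 mm internal diameter (from the text); 1 mm graduation assumed.
diameter_mm = 100.0
area_mm2 = math.pi * (diameter_mm / 2) ** 2  # cross-sectional area in mm^2
ml_per_mm = area_mm2 / 1000.0                # 1 ml = 1000 mm^3

print(f"volume per 1 mm graduation: {ml_per_mm:.2f} ml")  # ~7.85 ml
```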

4.2.3 Large-Scale Process Simulation True Triaxial Rig

4.2.3.1 Introduction

Chapter 4: Test equipment and calibration

that, although they allowed surface movement of the ballast (e.g. crib ballast movement), lateral flow of the ballast (transverse to and along the rail) was not permitted. It was concluded that better modelling of field conditions would be provided by a device that permitted the application and control of different confining pressures in the two orthogonal horizontal directions and also allowed both surface movement and lateral ballast displacement under vertical (traffic) load. Such equipment was designed and constructed specifically for this research.

A general view of the process simulation true triaxial rig (PSTTR) used in this study is presented in Figure 42. The apparatus was built to accommodate prismatic test specimens (600×800×600 mm) and to subject them to three mutually orthogonal independent stresses. It also permitted lateral spreading of ballast due to the unrestrained sides of the box. The PSTTR comprised four main parts, viz., the prismoidal triaxial cell, the axial loading unit, the confining pressure control system, and the horizontal and vertical displacement monitoring devices. A schematic diagram of the test rig apparatus is presented in Figure 43. The construction drawings are presented in Appendix B.

4.2.3.2 Prismoidal Triaxial Cell

A strong and rigid steel frame was fitted to a 20 mm steel base plate, as shown in Figs. 43 and 44. The high yield prismoidal steel box (without bottom and top) was placed inside the frame and supported on the displacement system. There was a gap of about 1 mm between the vertical walls and the base plate, which permitted free movement of the vertical walls when subjected to a horizontal force. The cell could accommodate specimens with dimensions of 600×800×600 mm. However, these dimensions could be

Figure 42. General view of the process simulation true triaxial rig

[Figure 43 annotations: capping and ballast layers; North, South, East and West walls; embedded settlement devices; rail segment; timber sleeper segment; pressure/load cells; dynamic actuator; displaceable walls; rubber mat; linear bearing system; load distributing plate; principal dimensions in mm.]

Figure 43. Section through the process simulation true triaxial rig ready for testing


(a)

(b)

Figure 44. Schematic diagram of the process simulation true triaxial rig: (a) plan view; (b) section Y-Y

decreased or increased depending upon test requirements. For the vertical wall plate, a 10 mm thickness was selected as this provided a relatively light but adequately stiff boundary. Stiffness was checked using the elastic theory of thin plates, which predicted a deflection of less than 0.01 mm (equivalent to a 0.003 % strain on the specimen) at the maximum expected confining pressure. Nevertheless, in order to eliminate any non-uniform distribution of the confining pressure, the middle section of each wall was stiffened by fitting a 300×300×10 mm thick plate centred at the point of load application. This ensured a uniform distribution of confining pressure over the entire thickness of load bearing ballast (300 mm below sleeper).
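The stiffness check rests on the flexural rigidity of the 10 mm plate. A hedged sketch of the quantities involved, assuming E = 200 GPa and ν = 0.3 for steel (typical values, not stated in the text) and taking the specimen strain over the 300 mm half-dimension of the 600 mm specimen (the basis that reproduces the quoted 0.003 % figure):

```python
# Flexural rigidity D = E*t^3 / (12*(1 - nu^2)) of the 10 mm wall plate,
# and the strain equivalent of the predicted 0.01 mm deflection.
E = 200e9   # Pa, assumed Young's modulus for steel
nu = 0.3    # assumed Poisson's ratio for steel
t = 0.010   # m, plate thickness (from the text)
D = E * t**3 / (12 * (1 - nu**2))   # flexural rigidity, N*m

deflection_mm = 0.01   # predicted maximum wall deflection (from the text)
gauge_mm = 300.0       # assumed half-dimension of the 600 mm specimen
strain_pct = deflection_mm / gauge_mm * 100.0

print(f"D = {D/1e3:.1f} kN*m, equivalent strain = {strain_pct:.4f} %")
```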

For ease of reference and monitoring, the box walls were named according to the orientation of the device within the laboratory (Figs. 43 and 44). Accordingly, the walls parallel to the sleeper direction were named the North Wall (NW) and South Wall (SW), and the walls perpendicular to the sleeper direction were named the East Wall (EW) and West Wall (WW).

The most critical part of the test box design was the provision of walls that could displace independently; the connections between the walls permitted this. For the displacement system, linear bearings mounted on steel rods were used, a mechanism that permitted near-frictionless inward and outward wall displacement.

Linear bearings were placed near the corners of each wall, four per wall, hence sixteen in all. In order to permit the movement of the E and W walls, and to keep the specimen properly confined inside the walls, it was necessary to design the N and S walls to extend up to 100 mm on each side (Fig. 44). Various options were considered to solve this problem, and the simplest and most effective was selected. Two free rotating strips (50 mm wide, with trimmed edges) were inserted and connected to the N and S wall plates towards their boundaries.

At the test box corners, eight movable connections were provided, two at each corner, 100 mm from the top and bottom of the box. These consisted of a 50 mm diameter wheel rolling along a rail fitted on the E and W walls at specified heights (Figs. 43 and 44). This simple system ensured that the edges of the N and S walls displaced simultaneously with the E and W walls. In addition, these connections provided lateral support to minimize tilting of the walls. The allowable lateral displacement of a wall in the N-S direction (X-X or σ2 direction) was ±50 mm, equivalent to a ±16.7 % deformation, whereas in the E-W direction (Y-Y or σ3 direction) each wall could displace ±50 mm, representing a ±12.5 % deformation (Fig. 44). The directions of the two horizontal principal stresses could be interchanged simply by re-defining them.
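The quoted deformation percentages are consistent with strains computed over the half-dimensions of the 600×800 mm plan (an interpretation; the text does not state the strain basis explicitly):

```python
# Allowable wall travel expressed as specimen deformation, assuming each
# percentage is taken over the half-dimension of the 600 x 800 mm plan.
travel_mm = 50.0          # allowable travel of one wall (from the text)
ns_half_mm = 600.0 / 2    # N-S (sigma_2) direction half-dimension
ew_half_mm = 800.0 / 2    # E-W (sigma_3) direction half-dimension

ns_pct = travel_mm / ns_half_mm * 100.0
ew_pct = travel_mm / ew_half_mm * 100.0

print(f"N-S: +/-{ns_pct:.1f} %, E-W: +/-{ew_pct:.1f} %")  # 16.7 %, 12.5 %
```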

4.2.3.3 Axial Loading Unit

The axial load was provided by a dynamic actuator (Dartec) with a maximum capacity of 10 t. The frequency of the applied load depended on its amplitude, with frequencies of up to 50 Hz possible at very small amplitudes. The steel ram (100 mm diameter) of the dynamic actuator transmitted the major principal stress, σ1, to the ballast specimen through a segment of timber sleeper (Figs. 42 and 43). The principal features of the vertical loading unit are presented in the following sections.

a. Hydraulic Power Unit

The dynamic actuator was driven by a hydraulic power unit, which had a control console close to the testing device and a remote oil pump. The hydraulic power unit supplied 267 l/min of Turbine Oil (No. 140) at a steady pressure of 2.75 MPa through 50 mm diameter lines and had a storage capacity of 100 litres. The pump was of a constant pressure, variable volume type, incorporating a swash plate. The electric motor that drove the main hydraulic oil pump had automatically controlled star-delta starting. Its control circuit allowed for both local operation and remote operation from the console.

The hydraulic power unit was water-cooled, with the level of the cooling water sensed electro-magnetically. To avoid overheating and damage, the hydraulic power unit would automatically shut down in the event of cooling water depletion. The temperature control system incorporated a thermostat and solenoid control switches. The normal setting of the thermostat was 40 °C, although the system could be operated at temperatures up to 60 °C.

b. Hydraulic Power Circuit

The hydraulic pump delivered oil, via a filter and a non-return check valve, to a delivery manifold. An accumulator, fitted with a solenoid operated relief valve, was connected to the delivery manifold. When necessary, the relief valve returned oil directly to the oil storage tank (through a diffuser) in order to control system pressure. There was also a pressure gauge with an isolating valve that drained to the tank. The oil return line was connected to a separate return manifold at the hydraulic power unit, and the oil flowed from the return manifold directly to the tank.

The oil cooler was a water-cooled heat exchanger. A centrifugal pump took oil from the oil storage tank, passed it through the oil cooler, and returned it to the tank. A thermostatically controlled valve regulated the flow of cooling water, supplied by a cooling water pump, to the heat exchanger.

c. Dynamic Actuator Control System

A servomechanism incorporating a hydraulic control valve controlled the supply of oil from the hydraulic power unit to the hydraulic ram of the dynamic actuator. The ram capacity of 10 t provided simulation of axle loads of up to 40 t, allowing for the distribution of wheel load on the adjacent sleepers (Jeffs and Tew, 1991). The servomechanism was operated from the console at the testing device and was capable of controlling either applied load (and therefore stress) or ram stroke (and therefore strain), as desired. It also allowed the selection of differing load cycle waveforms, viz. sinusoidal, triangular or square. Load amplitude was dependent upon the selected load cycle frequency; the travel of the hydraulic ram also determined the amplitude of the applied load. The lowest load cycle frequency was 0.01 Hz. At this frequency, the maximum ram travel was 25 mm with a corresponding maximum load of 10 t. At a frequency of 25 Hz, however, the ram travel was only 1.7 mm with a corresponding maximum load of 5 t. It was possible to set the hydraulic power unit control to terminate loading after a specified number of load cycles. The servomechanism could be connected to a data-logger to provide specific dynamic actuator load cycle input.

d. Instrumentation

Embedded pressure cells measured the variation of the applied vertical pressure with depth. Considering the sharpness of the ballast aggregates, which could damage the pressure cells, and to ensure intimate contact between the coarser grains and the small pressure cells (active area 40 mm diameter), the pressure cells were placed between two rigid steel plates (150×150×6 mm). The area around the pressure cells was filled with a layer of neoprene foam (3 mm thick) that provided insignificant resistance to relative movement between the two steel plates. At the beginning of the test program, two soil pressure cells (Kulite, 0234) with a maximum capacity of 1200 kPa were used. During the test program these pressure cells were replaced by alternative load cells (Huston and ALD, maximum capacity 133 kN, equivalent to a 6000 kPa pressure over the plate area, with a cell height of 15 mm), which were also placed between steel plates. Electrical cables from the load cells were protected by a reinforced PVC cover to resist penetration by the ballast aggregates. The output of the pressure or load cells was monitored by a strain meter box (HBM Digitaler Dehnungsmesser DMD, 20A), which permitted readings on four channels.
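The quoted 6000 kPa equivalence of the replacement load cells can be checked against the plate dimensions (the result agrees with the quoted figure to rounding):

```python
# Equivalent pressure of the 133 kN load cells acting over the
# 150 x 150 mm bearing plates (both values from the text).
force_kN = 133.0
plate_side_m = 0.150
pressure_kPa = force_kN / plate_side_m**2  # kN/m^2 = kPa

print(f"equivalent pressure: {pressure_kPa:.0f} kPa")  # ~5900 kPa
```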

4.2.3.4 Confining Pressure Control System

a. Ballast Confinement System

The ballast was confined in the testing device by its four walls, each supported by an identical single-acting hydraulic jack. All four hydraulic jacks were of the Enerpac type RC-106. These had a maximum load capacity of 101.7 kN and a stroke of 155 mm, and were selected for their high force capability and compact size.

Their overall length of 247 mm left a clearance of 150 mm to the ballast confinement walls. The capability of the hydraulic jacks allowed the maintenance of a maximum ballast confining pressure of 200 kPa, well above the pressure range used in the present test program.
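The 200 kPa capability is consistent with the jack capacity spread over the largest wall face. A minimal check, assuming the governing face is the 800×600 mm wall (an assumption; the smaller 600×600 mm face gives a larger margin):

```python
# Jack capacity expressed as confining pressure on the assumed
# 800 mm x 600 mm wall face.
jack_kN = 101.7                   # Enerpac RC-106 capacity (from the text)
wall_area_m2 = 0.800 * 0.600      # assumed governing wall face
max_pressure_kPa = jack_kN / wall_area_m2

print(f"max sustainable pressure: {max_pressure_kPa:.0f} kPa")  # ~212 kPa
```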

By interconnecting the hydraulic supplies of opposing jacks, it was ensured that their output forces were equal and that the jacks operated in opposing pairs. The North and South jacks were interconnected and supplied by a hand pump (Enerpac type P-39) having a rated maximum pressure of 700 bar (70 MPa) and a volumetric displacement of 2.61 × 10⁻³ litres per stroke. The hand pump was provided with a pressure gauge that enabled the system pressure to be monitored with an accuracy of ±20 kPa. An electro-hydraulic pump (ASEA), which had a maximum pressure capacity of 35 MPa and was fitted with a two-way directional control valve, supplied the interconnected East and West jacks. The jack interconnection arrangements allowed for independent ballast confinement and loading in the two directions, North/South and East/West. These loads gave rise to the intermediate principal stress, σ2, and the minor principal stress, σ3, respectively.

b. Instrumentation

The ballast confinement walls were designed to travel up to 100 mm in the EW or NS direction. The jack clearance of 150 mm to the ballast confinement walls left 50–80 mm to accommodate a load cell to measure the jacking load, and therefore the confining pressure.

The range of confining pressures (10–30 kPa) to be applied in the two orthogonal horizontal directions determined the capacity of the load cells. For the NS direction, two identical load cells (Interface) were used; their maximum capacity of 44 kN permitted an equivalent of 90 kPa confining pressure on the specimen. For the EW direction, two different load cells were used. These load cells (Transducers Inc and Deltacel types) had a maximum capacity of 3.5 kN, equivalent to a 10 kPa confining pressure on the specimen. Hemispherical seatings were provided between the load cells and the box walls, eliminating the possibility of load cell damage from wall rotation during testing. All four load cells were connected to a second strain meter box (HBM Digitaler Dehnungsmesser DMD, 20A), where the output of each load cell was displayed on a separate channel.
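The quoted pressure equivalences follow from spreading each load-cell capacity over the corresponding wall face. A sketch, assuming the N/S walls bear on an 800×600 mm face and the E/W walls on a 600×600 mm face (face areas are an interpretation of the 600×800×600 mm specimen geometry):

```python
# Load-cell capacity expressed as confining pressure on assumed wall faces.
ns_kPa = 44.0 / (0.800 * 0.600)  # Interface cells, N-S direction
ew_kPa = 3.5 / (0.600 * 0.600)   # Transducers Inc / Deltacel cells, E-W

print(f"N-S: {ns_kPa:.0f} kPa, E-W: {ew_kPa:.0f} kPa")  # ~92 kPa, ~10 kPa
```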

4.2.3.5 Horizontal and Vertical Displacement Monitoring Devices

It was imperative to monitor precisely the horizontal and vertical deformation of the ballast layer during each test. To do this, a system of eight digimatic calipers and eight dial gauges was used, with one measuring device placed close to each corner of each side of the test apparatus, as shown in Figure 45. The eight Mitutoyo digimatic calipers were fixed on rigid consoles at the corners of the East and West walls and could measure displacements of up to 150 mm. Their accuracy of 0.01 mm corresponded to a 0.0025 % strain in the East-West direction, and they could be set to zero at any time during the test. The displacement of the North and South walls was similarly monitored by a system of eight Mitutoyo dial gauges with a travel of 50 mm and an accuracy of 0.01 mm, equivalent to a 0.003 % strain in the North-South direction. The lower dial gauge stands were fixed directly to the bottom plate. For the upper positions, however, stands were fixed on rigid beams fitted close to the position of the dial gauges. This was done because test vibrations might have dislodged any instrumentation mounted directly on the box walls.
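The two strain-resolution figures follow from the 0.01 mm instrument resolution taken over the half-dimensions of the plan (the same assumed basis as for the wall-travel percentages):

```python
# Instrument resolution expressed as strain, assuming the strain is
# computed over the half-dimensions of the 600 x 800 mm plan.
resolution_mm = 0.01
ew_pct = resolution_mm / 400.0 * 100.0  # calipers, E-W direction
ns_pct = resolution_mm / 300.0 * 100.0  # dial gauges, N-S direction

print(f"E-W: {ew_pct:.4f} %, N-S: {ns_pct:.4f} %")  # 0.0025 %, ~0.0033 %
```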

In order to monitor the surface and internal deformation of the ballast layers, a spatial measurement grid was selected. The displacement of the nodes of this grid was measured by embedding settlement devices at selected points and levels. For ease of monitoring and interpretation of the collected data, the points of the grid were given common coordinates in plan. The grid dimensions were also selected to provide sufficient distance between the


(a)

(b)

Figure 45. Detail of the measurement devices for lateral displacement: (a) digimatic calipers on the East and West walls (elevation); (b) dial gauges on the North and South walls (plan)

displacement devices, the rigid box boundaries and the timber sleeper segment in order to permit freedom for rotation of the settlement devices. However, three points at the centre of the specimen (along the E-W axis) fell under the sleeper position (Fig. 46). It was therefore necessary to drill 25 mm diameter holes through the timber sleeper to permit the positioning of two settlement devices (e.g. points 18 and 32 on the capping layer surface, and 67 and 81 on the surface of the load-bearing ballast).

The internal displacement of selected points was monitored by reference to the embedded settlement points. Each of these had a stiff base (a 50×50×3 mm steel plate) with a 6 mm steel rod welded to its centre (Fig. 47). The rods of the devices placed on top of the capping layer were 500 mm long. Shorter settlement devices (200 mm) were used to monitor the displacements of the load bearing ballast layer. It was necessary to trim the bases of the shorter settlement rods to fit them close to the longer devices (Fig. 47). Insertion of settlement devices at the grid points near the boundaries proved difficult. Consequently, it was decided that the level of these points on both the capping and load-bearing ballast surfaces would be measured only at the beginning and at the end of each test.

The same grid was used to monitor the deformation of the crib ballast surface.

However, for the crib ballast, it was possible to monitor the deformation of the grid points at the box boundaries. A set of five dial gauges (Mitutoyo, reading accuracy 0.01 mm), each having a maximum travel of 50 mm, was used to monitor the vertical displacement. These were fitted on a stiff beam supported on the top frame and profiled to position the dial gauges as close as practicable to the reading points, as schematically presented in Figure 48.


[Figure content: numbered grid of settlement points (50–98) on top of the load-bearing ballast, with plan coordinates in mm and the East, West, North and South walls indicated.]

Figure 46. Selected grid for monitoring the internal deformation of the ballast layer (grid under sleeper, top of load bearing ballast)

Figure 47. Settlement points at the top of load bearing ballast

Figure 48. Arrangement for the measurement of vertical deformations

This arrangement of dial gauges proved error-prone in the first test; for subsequent tests, the surface displacement of the selected grid was therefore measured relative to a reference level established at the beginning of each test. As the grid was identical for all levels considered, this procedure was adopted to monitor the displacement of the settlement points.

4.3 CALIBRATION OF INSTRUMENTATION AND DATA