Compensatory hydraulic uptake of water by tomato due to variable root‐zone salinity

Plant root systems are exposed to spatial and temporal heterogeneity in water availability. In the long term, compensation (increased uptake by roots in areas with favorable conditions in response to decreased uptake in areas under stress) is driven by root growth and distribution. In the short term (hours to days), compensative processes are less understood. We hypothesized hydraulic compensation, in which locally lowered water availability is accompanied by increased uptake from areas where water remains available. Our objective was to quantify instantaneous hydraulic root uptake under conditions of differential water availability. Tomato (Solanum lycopersicum L.) plants were grown in split‐root weighing‐drainage lysimeters in which each half of the root system could be separately exposed to short‐term salinity. Uptake was quantified from each of the two root zone compartments. One‐sided exposure to salinity immediately led to less uptake from the salt‐affected compartment and increased uptake from the nontreated compartment. Compensation occurred at a salinity, caused by NaCl solution of 4 dS m−1, that did not decrease uptake in plants with entire root systems exposed. At higher salinity, 6.44 dS m−1, transpiration decreased by ∼50% when the total root system was exposed. When only half of the roots were exposed, total uptake was maintained at the levels of nonstressed plants, with as much as 85% occurring from the nontreated compartment. The extent of compensation was not absolute and was apparently a function of salinity, atmospheric demand, and duration of exposure. As long as there is no hydraulic restriction in other areas, temporary reduction in water availability in some parts of a tomato's root zone will not affect plant‐scale transpiration.


INTRODUCTION
Water plays an important role in plant functioning: in addition to the small number of water molecules needed in photosynthetic and sequestration processes, plants take up and evaporate large volumes of water (transpiration). Water movement during the day from the soil to roots and from roots to transpiring organs is a physical process that functions passively but depends on soil and plant hydraulic conductivity. Plants have physiological mechanisms allowing regulation of transpiration by governing the opening of transpiring elements and control over hydraulic conductivity in the roots (Steudle, 2000).
Spatial variability of water content and solution ion concentration in the root zone of most agricultural soils exists due to spatial variability in the physical and chemical properties of the soils themselves. Additionally, there is temporal variation in water availability in the different regions of the root zone due to nonuniform patterns of irrigation, evaporation, transpiration, and more. In contrast to the spatial variability of the soil, which is often relatively stable and only slowly, if at all, changing, the temporal variability is likely to occur quickly and often. Such changes are likely dynamic and irregular. In order to deal with these quick and temporary changes, a plant needs methods for regulating uptake between different parts of its root zone, such as to temporarily augment uptake from certain areas and simultaneously reduce uptake from others.
Plant responses to varied conditions for water uptake can be divided into short-term (hours) and long-term (days and longer) mechanisms (Lobet et al., 2014; Munns, 2002; Munns et al., 2000). In the long term, root growth is directed into zones with superior conditions for survival and uptake. Root density and the number of young roots will be greatest in zones with an optimal balance of available water and nutrients, sufficient oxygen, and minimal harmful material such as detrimental salts (Bazihizina et al., 2012; Munns, 2002). Short-term responses, potentially enabling compensated uptake without changes in root distribution, are less understood. Possible drivers and explanations for short-term changes in uptake patterns are simple hydrology based on the movement of water down potential energy gradients toward equilibrium, regulation of hydraulic conductivity within the roots as water moves from the root surface to the xylem via cell membranes, and root-mediated hydraulic redistribution (Bazihizina et al., 2012; Cai et al., 2018; Lei et al., 2019; Steudle, 2000; Thomas et al., 2020). Differential changes in hydraulic conductivity in different parts of the root zone of a single plant may improve plant-scale efficiency of water uptake by allowing more uptake from regions with higher water potential where water is more readily available.
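The potential-gradient driver described above can be illustrated with a toy two-compartment model. This is our illustration, not a model from the paper: uptake from each side is treated as a parallel hydraulic pathway, proportional to that side's root conductance times the difference between the compartment's soil water potential and the xylem potential. All names and numbers (in MPa and arbitrary conductance units) are hypothetical.

```python
def uptake_split(psi_soil, conductance, psi_xylem):
    """Uptake flux from each root-zone compartment (arbitrary units).

    Each compartment is a parallel pathway: flux = k * (psi_soil - psi_xylem),
    clipped at zero (no reverse flow in this simple sketch).
    """
    return [k * max(psi_s - psi_xylem, 0.0)
            for psi_s, k in zip(psi_soil, conductance)]

# Equal conditions on both sides: uptake splits evenly.
even = uptake_split(psi_soil=[-0.05, -0.05], conductance=[1.0, 1.0],
                    psi_xylem=-0.5)

# Salinizing one side lowers its total water potential (osmotic component),
# shrinking the driving gradient on that side only; uptake shifts to the
# fresher side without any change in root distribution.
uneven = uptake_split(psi_soil=[-0.05, -0.30], conductance=[1.0, 1.0],
                      psi_xylem=-0.5)
```

Under equal potentials the two fluxes are identical; lowering one compartment's potential reduces only that compartment's share, which is the short-term hydraulic behavior the experiments below set out to measure.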
Stress-induced signaling by roots-to-shoots may complicate the mechanisms of uptake from regions differing in their availability of water. Some studies suggest that partial root zone stress-causing conditions trigger signaling with abscisic acid or other hormones, which cause stomatal closure and reduction of transpiration, and therefore, uptake (Kang & Zhang, 2004). If this is the case, compensation would be a function of, and accompanied by, less total water uptake, potentially without a cost to plant carbon sequestration (Sepaskhah & Ahmadi, 2012).
Core Ideas
• Instantaneous compensated water uptake was studied in split-rooted tomato lysimeters.
• Short-term (hours) exposure to varied-strength salinity was applied in one or both compartments.
• Uptake decreased in the compartment under stress and increased in the compartment without.
• Immediate compensation, no plant-scale effect, and up to 85% uptake from the nonstressed compartment were measured.
• Compensation was not absolute, being an apparent function of salinity, atmospheric demand, and stress duration.

There are two strategies for modeling plant root water uptake. The first approach, microscopic, solves for water flow according to physical principles of potential gradients and conductivities. Because the parameters for solving flow at this scale are unique to every part of the root zone and to each segment of each root where uptake occurs, this approach is rarely used and not practical except for very small scale modeling, such as for parts of single roots (Jarvis, 2011; Schröder et al., 2014). One of the major challenges in microscopic models is determining the osmotic potential at root surfaces where salts accumulate. The second approach, macroscopic, is based on averaging conditions of the root zone soil over space and time. This is the strategy used in most popular coupled hydrological-plant models simulating water and solute flow and transport, including uptake. These models depend on some level of empirical calibration. For example, while water flow to and into roots can be described using physical principles of matric potential gradients and conductivities, the influence of salinity in these models is quantified by a reduction function based on empirical evidence rather than on osmotic potential. In macroscopic uptake models, compensation is often handled with a mathematical fudge factor with no consideration of driving mechanisms (Albasha et al., 2015; Jarvis, 2011; Oster et al., 2012; Šimůnek & Hopmans, 2009). There are more advanced macroscopic uptake models in which compensation is a function of the physics of root architecture and the hydraulics of soil and roots (Couvreur et al., 2012; Javaux et al., 2013; Schröder et al., 2014). These models are typically challenging to parameterize and run and, like microscopic models, their use is limited to complicated single-root to single (usually virtual) plant simulations. Studies of such models rarely include actual measured data and lack robustness regarding calibration and validation.

We are interested in advancing understanding of root water uptake as a function of the variable conditions within the root zone of a single plant. The driving hypothesis in this work is that when water availability is equal in all parts of a plant's root zone, uptake will be equal from each part, and when water availability is not equal in all parts of a plant's root zone, uptake will vary accordingly.
We suggest that, as long as there is no hydraulic limitation, water uptake from other regions of the root zone will increase to compensate for decreased uptake in zones where water is less available, such that the total water needs of the plant and its functioning are not compromised.
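The "fudge factor" style of compensation used in macroscopic models can be sketched in a few lines. The following follows the general form of the critical-stress-index approach associated with Jarvis (2011) and Šimůnek and Hopmans (2009); the function name, parameter names, and all values are our illustrative assumptions, not the paper's.

```python
def compensated_uptake(t_pot, root_frac, alpha, omega_c=0.5):
    """Compensated macroscopic root water uptake per soil compartment.

    t_pot:     potential transpiration (e.g., g/d)
    root_frac: fraction of roots in each compartment (sums to 1)
    alpha:     local stress reduction factor, 0 (full stress) to 1 (none)
    omega_c:   critical water stress index below which total uptake declines
    """
    # Weighted stress index over the whole root zone.
    omega = sum(f * a for f, a in zip(root_frac, alpha))
    # Compensation: uptake from less-stressed compartments is boosted so that
    # total uptake stays at t_pot as long as omega >= omega_c.
    scale = 1.0 / max(omega, omega_c)
    return [t_pot * f * a * scale for f, a in zip(root_frac, alpha)]

# Half the root zone moderately stressed: total uptake is still t_pot,
# with the unstressed side taking the larger share (compensation).
split = compensated_uptake(t_pot=1.0, root_frac=[0.5, 0.5], alpha=[1.0, 0.4])

# Both halves severely stressed: omega falls below omega_c and total
# uptake drops below t_pot (compensation is not absolute).
severe = compensated_uptake(t_pot=1.0, root_frac=[0.5, 0.5], alpha=[0.2, 0.2])
```

The design choice this sketch exposes is exactly the criticism in the text: the boost `1/max(omega, omega_c)` is an empirical rescaling, not a mechanism, which is why such models need data of the kind collected here.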

Vadose Zone Journal
FIGURE 1 Experimental setup. (a) Schematic view of dual sets of pots: upper pot filled with growing medium for root growth, lower pots for drainage collection, including valves for automatic emptying. (b,c) Initial separation of roots and main stem into two equal sections. (d,e) Secondary separation of roots and stems. (f) Planting into inner pots of lysimeters. (g) View of one table of split-root lysimeters in the greenhouse, including weighing platforms and the system for drainage collection and monitoring. (h) View of lysimeters from above (with surface mulching temporarily removed) to visualize the split-root tomatoes, irrigation system, and trellising devices.

The main objective of the research was to quantify instantaneous compensated water uptake by roots. We specifically set out to evaluate the following: (a) water uptake from separate parts of a root zone when water availability is equal, (b) the influence of decreased water availability in part of the root zone on the uptake in each part at different time scales, (c) the influence of decreased water availability in part of the root zone on whole-plant transpiration, and (d) the compensated uptake mechanisms.

MATERIALS AND METHODS
The short-term compensation response to salinity of tomato (Solanum lycopersicum L., cultivar M82) as a model plant was investigated in a novel experimental setup. Individual plants were grown with two differentiated root systems (split-root) planted in separate pots. An automated irrigation system allowed application of water containing different concentrations of salts while maintaining control of irrigation amount, solution concentrations, and timing of irrigation events. Water uptake from each part of the split-root zone was measured continuously and precisely using a novel lysimeter system. Experiments were conducted involving short-term application of salinity during two cycles of tomato plant growth in a greenhouse at the Gilat Research Center (40˚31ʹ N, 40˚34ʹ E, 150 m asl).

The split-root lysimeter system
Dual lysimeters were prepared in which individual plants had half their roots in each of two separate weighing-drainage units (Figure 1). Each side consisted of an inner 4-L pot filled with coarse sand growing medium (Negev Minerals 20-30, size distribution 0.595-0.841 mm); an outer 10-L pot for collection of drainage water; a drainage pipe and valve to empty the drainage water from the outer pot; and a weighing platform with a load cell (Vishay 1042-10k-C3) continuously monitoring the mass of the two pots. Each load cell could weigh up to 15 kg with 1.5-g resolution.
The bottom of the inside of the outer pot was filled with a layer of impenetrable sealant that sloped such that the exit hole for drainage was located at the lowest point, thus disallowing standing drainage water and ensuring that the drainage was fully emptied and collected when the valves were open. Drainage water leaving the outer pot was directed through a funnel into a secondary collection container from which it could be sampled and disposed. Water content in the soil at field capacity was 13-14% by weight. The pots were filled at a density of 1.7 kg L−1, meaning that each pot held approximately 950 ml of solution at field capacity.
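The stated pot water storage follows directly from the reported packing density and gravimetric water content; a quick check (using the upper end of the reported 13-14% range) reproduces the ~950 ml figure:

```python
# Worked check of pot water storage at field capacity, using values from the
# text (pot volume, bulk density, gravimetric water content at field capacity).
pot_volume_L = 4.0     # inner pot volume
bulk_density = 1.7     # kg sand per L of pot
theta_g = 0.14         # gravimetric water content at field capacity (13-14%)

sand_mass_kg = pot_volume_L * bulk_density   # 6.8 kg of sand per pot
water_mass_kg = sand_mass_kg * theta_g       # ~0.95 kg of water
water_ml = water_mass_kg * 1000              # ~950 ml, as stated in the text
```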

Preparation of split-rooted tomato seedlings
Seedlings were provided by a commercial nursery (Hishtil, Ashkelon, Israel) at 4 wk after seeding in 17-ml trays filled with potting medium (Figure 1b). The roots of each seedling were then dipped in water to partially separate roots from the medium. The main root of each seedling was cut and the remaining roots separated laterally into two equal portions, each connected to the main stem, which was trimmed from the bottom to around 2 cm (Figure 1c). Each side of the roots was replanted into potting soil in 50-ml plastic pots (Figure 1d). The seedlings were then placed in a climate-controlled growth chamber under optimal light and temperature conditions. After 2 wk, the main stem of each seedling was cut perpendicularly around 4 cm above the roots, and the roots with the potting soil were transplanted into the inner pots of the lysimeters (Figure 1e,f). The soil surface of each inner pot was covered with plastic mulching to reduce surface evaporation to an assumed negligible level.

Irrigation
Except during the treatment application periods, the inner pots were consistently irrigated with high quality (low salinity) nutrient solution. During treatment application periods, one or both sides of split-rooted plants received high salinity nutrient solution. The two solutions were prepared and stored in separate 100-L containers. After transplanting, each solution included N-P-K at 6-6-6, giving 70 mg N-NO 3 L −1 , plus magnesium (Mg) (1% w/v) and micronutrients given as Shefer fertilizer (Deshanim). After 2 wk, the solutions were switched to N-P-K at 4-2.5-6 enriched with calcium (Ca) (1% w/v) and micronutrients as Mor fertilizer (Deshanim), given at a rate of 80 mg N-NO 3 L −1 , 120 mg K L −1 , 22 mg P L −1 , and 80 mg Ca L −1 . Salinity was added to the high salt irrigation solution as NaCl at rates that varied during the experiments. Each container was connected to the pots via a centrifugal electric pump (Pedrollo). Pressure of the irrigating solution was regulated by a return pipe circulating back into the solution container, thus allowing a high flow rate from the pump. Irrigation rates were monitored and controlled by liquid chemical meters with 0.1-L resolution (Arad Group). The irrigation system was controlled and automated (S.A.S. Equipment R&D).
Pipes brought the solutions from each container to each side of the table holding the lysimeters. Drippers (4 L h −1 , pressure compensated) (Netafim) were connected to the pipes according to treatments. Each dripper was split via 2.5-mm tubing into four exits per pot to ensure full and even wetting of the soil surface. Switching a pot from one irrigation water quality to the other was accomplished by disconnecting and reconnecting the 2.5-mm tubing connections to the drippers on the appropriate pipes without disturbing the distribution of water in the pots.
Plants were irrigated several times throughout the day. One large irrigation event of 500 ml per container was applied before dawn and two to four events of 300 ml each during the day. The number of events was determined in order to give irrigation in excess and guarantee drainage, and therefore, increased as the tomato plants grew and according to environmental demand in the greenhouse.

Experiments
A total of 16 split-root lysimeters were placed on two tables. Each table had additional nonweighing split-root plants on each of its ends to act as buffers and reduce boundary effects. Each treatment was randomly assigned to two lysimeters on each table.
A summary of the nomenclature used to define the lysimeters and their separate pots during treatment application is given in Table 1. The two sides of each plant's root zone (compartments) were irrigated equally with good quality (low salinity) nutrient solution except during treatment application periods. Treatment application was limited to given days, with treatments replicated four times. Treatments included a control with no added salinity in either compartment of the split-root plants and homogeneous salinization (HS): high salinity given to both compartments of the split-root lysimeters. In order to evaluate accumulated effects and to indicate any possible carry-over effects of exposure to salinity, two versions of partial salinization were evaluated: the first was constant partial salinization (CPS), with the salinity always on the same side during each treatment application period, and the second was alternating partial salinization (APS), with the salinity alternating between the compartments. The concentration of salts and the length of exposure to them were altered during different treatment application periods. Treatments were applied by leaching with the solutions (high or low salinity) in the large irrigation event the night prior and irrigating with the same solutions during a full day. In the evening following treatment application days, an additional large leaching irrigation was given with low salinity water.
The experiment for growth cycle I was short-term exposure to salinity at differing salt levels (four rounds). This cycle was conducted in October and November 2018. Treatments with salinity lasted a single day. Treatments included control, HS, APS, and CPS. The low salinity solution had electrical conductivity (EC) of 1.06, 0.85, 0.80, and 0.97 dS m −1 , respectively, on the four treatment days. The high salinity solution was 3.03 dS m −1 on the first treatment day, 3.99 dS m −1 on the second, 3.47 dS m −1 on the third, and 6.15 dS m −1 on the final treatment day. Each treatment day was followed by 1 wk without treatments, with irrigation applied to all compartments using only low salinity solution.

TABLE 1 Nomenclature used in defining treatments

Term | Explanation
Treatment application | Days when salinity was given to one or both compartments of the split-root tomatoes
The experiment for growth cycle II was longer-term, consecutive days of exposure to salinity. This cycle was conducted in May and June 2019. The salinity treatment was with solution of 6.44 dS m −1 , while the low salinity solution had EC of 1.09 dS m −1 . In this cycle, two periods of treatment application days were conducted. The first lasted two consecutive days and the second lasted seven consecutive days. In the first period, no leaching of salts occurred between the two days. In the second period, leaching with low salinity water was practiced between each of the seven treatment days. In growth cycle II, the control, HS, and APS treatments were evaluated.
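To relate the reported EC values to the hydraulic framing of the study, the osmotic potential of the irrigation solutions can be approximated with the common US Salinity Laboratory rule of thumb ψ_o (MPa) ≈ −0.036 × EC (dS m−1). This conversion is our illustration; the paper itself reports EC only.

```python
# Approximate osmotic potential of irrigation solutions from their EC,
# using the widely cited approximation psi_o (MPa) ~= -0.036 * EC (dS/m).
def osmotic_potential_MPa(ec_ds_m):
    return -0.036 * ec_ds_m

# ECs reported in the experiments (low salinity, cycle I high levels,
# cycle II high level).
for ec in (1.06, 3.99, 6.15, 6.44):
    print(f"EC {ec:.2f} dS/m -> psi_o {osmotic_potential_MPa(ec):.3f} MPa")
```

By this approximation, even the strongest solution (6.44 dS m−1) lowers the soil solution potential by only about 0.23 MPa, small relative to typical midday xylem tensions, which is consistent with uptake shifting between compartments rather than stopping outright.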

Calculating evapotranspiration
Daily water balance was used to calculate evapotranspiration (ET), or uptake by roots, from each pot as follows:

ET = I − D − ΔW (Equation 1)

where I is the daily amount of irrigation per pot, D is the drainage amount determined by change in mass following drainage emptying events occurring during nighttime hours, and ΔW is the change of mass in a pot over 24 h, representing the change in storage of water in the soil. Irrigation amount per event per pot was measured by the change in mass during an irrigation event given in nighttime hours. In order for the change in lysimeter mass to accurately quantify and portray root water uptake and transpiration, drainage emptying was conducted during the night and not during irrigation. An illustration of the calculation of Equation 1 is shown in Figure 2A, where the two sides of a single lysimeter are shown on a day when one side received good quality solution and the other high salinity solution.
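The per-pot water balance can be written as a one-line function; the function name and the example masses below are our hypothetical illustration of Equation 1, not measured values.

```python
# Daily water balance per pot (Equation 1): uptake = irrigation - drainage
# - change in stored water, each term derived from the load-cell mass record.
def daily_uptake_g(irrigation_g, drainage_g, mass_start_g, mass_end_g):
    delta_storage = mass_end_g - mass_start_g  # change in water stored in pot
    return irrigation_g - drainage_g - delta_storage

# Example: 1100 g irrigated, 280 g drained, and pot mass fell by 30 g over
# 24 h (storage was depleted), giving 1100 - 280 - (-30) = 850 g of uptake.
et = daily_uptake_g(irrigation_g=1100, drainage_g=280,
                    mass_start_g=9000, mass_end_g=8970)
```

Because drainage emptying and irrigation were confined to nighttime, the daytime mass trace itself is uptake, which is what makes the instantaneous curves in Figure 2B possible.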
FIGURE 2 (A) Changes in mass in two sides of a single split-root lysimeter during a trial day (31 Oct. 2018). (B) Accumulated uptake. Lowercase letters indicate critical time points within the daily irrigation cycle during treatment periods: a, prior to predawn irrigation; b, after the end of predawn irrigation; c, after predawn drainage emptying; d, end of day, prior to drainage emptying; e, end of day after drainage emptying; and f, nighttime leaching irrigations. Blue lines are the side without salt (electrical conductivity = 0.97 dS m −1 ) and red lines are the side with salt added (6.15 dS m −1 ).

In addition to the daily uptake, instantaneous uptake in each root zone section is the change in load cell recorded mass as a function of time (Figure 2B). For the example in Figure 2, the differences in uptake seen between irrigation events during daylight hours in Figure 2A are reflected by different accumulated uptake.

Additional measurements
Solution and drainage water EC was measured using an EC meter (Eutech CON 700). Analyses of irrigation and drainage solutions were done for potassium (K) as an indicator of nutrient uptake. At the end of each growing cycle, plants were harvested. Diagnostic (youngest fully developed) leaves were sampled, rinsed in distilled water, dried at 70 ˚C, and analyzed for nitrogen (N), phosphorus (P), potassium (K), Ca, Mg, sodium (Na), and chloride (Cl). Macro elements in plant tissue were evaluated after extraction in sulfuric acid and peroxide. Nitrogen concentration was determined in an autoanalyzer (Lachat); concentrations of Na and K were determined with an atomic absorption spectrophotometer (Perkin-Elmer 460). Chloride was determined in a chloride analyzer (model 926, Sherwood). Stem water potential was determined midday (between 1100 and 1300 h) during the experimental period using a pressure chamber (ARIMAD, MRC) after leaves were covered with a sealable foil-lined plastic bag for 2 h according to Taiz and Zeiger (2006).

FIGURE 3 Average water uptake per compartment per day over the course of growth cycle I. Letters in the table below show significant differences between treatments on treatment application days. The value of probability (F) is shown on days when significant differences were found. Midday vapor pressure deficit (VPD) (kPa), averaged between 1200 h and 1500 h, is shown for each day during the cycle. Electrical conductivity (EC) (dS m −1 ) of irrigation solution for treatment days is also indicated. APS, alternating partial salinization; CPS, constant partial salinization; HS, homogenous salinization

Statistics and data processing
The results were analyzed with one-way ANOVA in JMP 14 statistical software (SAS Institute Inc.) and evaluated for significant differences using Student's t post hoc test at α = .05. Several processing steps were used to improve comparisons of water uptake per pot between treatments, normalizing for environmental conditions and differences in plant size. Daily uptake per pot was normalized to that of a prior reference day (rd):

ΔS = (S d − S rd ) / S rd × 100 (Equation 2)

ΔS %Normalized = ΔS − (S cd − S crd ) / S crd × 100 (Equation 3)

where ΔS is the change in daily uptake of a single pot, S rd is uptake on a given reference day, S d is uptake on the day of interest, ΔS %Normalized is the percentage change in daily uptake compared with the control, S crd is uptake of the control without salinity on the reference day, and S cd is uptake of the control on the day of interest. Reference days were chosen as the closest prior days without treatments applied.
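The normalization can be expressed compactly; the code below follows the variable definitions in the text (each pot's daily uptake relative to a prior reference day, with the control's relative change on the same day subtracted). The function names and example numbers are our illustration.

```python
def pct_change(s_d, s_rd):
    """Percentage change in a pot's uptake versus its reference day."""
    return 100.0 * (s_d - s_rd) / s_rd

def pct_normalized(s_d, s_rd, s_cd, s_crd):
    """Percentage change relative to the no-salt control: the pot's own
    relative change minus the control's relative change on the same day."""
    return pct_change(s_d, s_rd) - pct_change(s_cd, s_crd)

# Example: a salinized compartment drops from 400 to 250 g/d while the
# control rises from 380 to 418 g/d; the normalized change is
# -37.5% - 10% = -47.5%.
print(pct_normalized(s_d=250, s_rd=400, s_cd=418, s_crd=380))
```

Subtracting the control's change removes day-to-day swings in evaporative demand and plant growth, so the remaining signal is attributable to the salinity treatment.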

RESULTS

Growth cycle I: Once-weekly exposure to single-day salinity at differing rates
Average daily uptake from each compartment of the split-root tomato plant root-zones is given in Figure 3 for growth cycle I. The only significant differences in container-scale uptake occurred on the treatment application days for both APS and CPS, where one compartment received salty irrigation water and the other compartment continued to receive water without salt.
When conditions were equal, uptake from either side of the split-root plants was equal. This was true both for all the nontreatment days when no salinity was given and for the control and HS treatments receiving the same water quality in both sides of the split-root system, whether with or without added salt. Differences in water uptake between the sides were never greater than 15%, and the differences were not significant (Figure 3).

FIGURE 4 Whole plant transpiration (uptake) during growth cycle I. No significant differences were found between the treatments. Midday vapor pressure deficit (VPD) (kPa) is shown for each day during the cycle, and electrical conductivity (EC) (dS m −1 ) of irrigation solution for treatment days is also indicated. APS, alternating partial salinization; CPS, constant partial salinization; HS, homogenous salinization
Addition of salt to both sides of split-rooted tomato plant root systems (HS treatment) did not significantly affect absolute rates of uptake in either compartment (Figure 3), and therefore, did not decrease whole-plant-scale uptake and transpiration ( Figure 4).
Salinity added to one compartment of the split-root pots had no significant effect on the first treatment day when irrigation water EC was ∼3 dS m −1 , but significantly decreased uptake in the subsequent treatment days as salinity level increased. These decreases were accompanied by increased uptake in the nonsalt compartments of the same plants in the third and fourth treatment application days. The decreases and increases were limited to the days with treatments themselves. The extent of the compensative behavior increased with each subsequent treatment day and was largest on the fourth day, when irrigation water EC was just over 6 dS m −1 .
Whole plant uptake (transpiration) (Figure 4) was not affected by the salinity given on treatment days for either HS or for the one-sided salinity treatments APS and CPS. Whole plant transpiration increased from around 500 g d −1 on the first treatment day to ∼1,400 g d −1 on the fourth treatment day. The increase in transpiration was due mostly to plant growth but was also influenced by environmental conditions as reflected in the vapor pressure deficit in the greenhouse.
Normalizing the daily compartment-scale uptake to each compartment's uptake the day before treatment days and to the no-salt control reveals significant differences due to treatments on the second through fourth treatment days (Figure 5). On the second treatment day, 16 Oct. 2018, both the constant and alternating compartments receiving salts decreased uptake by ∼45%. The other, nonsalinated compartments of the plant root zone increased uptake by 20 and 30% for APS and CPS, respectively, with no significant difference between them. The third treatment day saw greater compensation, with a similar ∼42% decrease in the salt-added sides, a 39% increase in the no-salt compartment of APS plants, and a 58% increase in the CPS compartment without salt. Again, the differences between the compartments either receiving or not receiving irrigation with salt were not significant. On the fourth treatment day, when irrigation water salinity in the salt treatments increased to just over 6 dS m −1 , compensation was characterized by an almost equal decrease in the with-salt compartments and increase in the without-salt compartments of around 52 and 49%, respectively, with no differences between the alternating and constant treatments.

FIGURE 5 Cycle I. Percentage change in compartment-scale uptake relative to uptake on the day before and to the no-salt control (Equation 3). Letters indicate significant differences between treatments on given days. Electrical conductivity of nonsalty and salty water was 0.85 and 3.99 dS m −1 , respectively, on 17 Oct. 2018; 0.80 and 3.47 dS m −1 , respectively, on 24 Oct. 2018; and 0.97 and 6.15 dS m −1 , respectively, on 31 Oct. 2018. APS, alternating partial salinization; CPS, constant partial salinization; HS, homogenous salinization
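The near-symmetry of the fourth-day response (−52% on the salty side, +49% on the fresh side) is what keeps whole-plant uptake flat in Figure 4. A quick arithmetic check, assuming a hypothetical even baseline split of the ∼1,400 g d−1 whole-plant uptake:

```python
# Check that near-equal opposite changes in two equal compartments leave
# whole-plant uptake almost unchanged. The 700 g/d per-side baseline is a
# hypothetical even split of the ~1400 g/d whole-plant uptake.
baseline_per_side = 700.0
salty = baseline_per_side * (1 - 0.52)   # -52% in the with-salt compartment
fresh = baseline_per_side * (1 + 0.49)   # +49% in the without-salt compartment

total_change_pct = 100 * (salty + fresh - 2 * baseline_per_side) \
                   / (2 * baseline_per_side)
# total_change_pct = (-52 + 49) / 2 = -1.5%, i.e., essentially full compensation
```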
Uptake from both compartments of the HS treatment decreased significantly less than in the one-sided salt-added compartments: by 13% on treatment day 2, 3% (not significantly different from control) on treatment day 3, and 12% on day 4.

FIGURE 6 Average water uptake per compartment per day over the course of growth cycle II. Letters in the table below show significant differences between treatments on treatment days. Significance (F) values are shown for each day's measurements. Midday vapor pressure deficit (VPD) (kPa) is shown for each day during the cycle, and electrical conductivity (EC) (dS m −1 ) of irrigation solution for treatment application days is also indicated. APS, alternating partial salinization; HS, homogenous salinization

Growth cycle II: Multiple-day exposure to salinity
Multiple days of exposure during the second growth cycle resulted in significant differences in uptake from compartments receiving irrigation water with EC of 6.44 dS m −1 (Figure 6). The first treatment application period lasted 2 d, 13-14 May 2019, without leaching between them. On these days, the HS treatment, with salt on both sides, led to decreased uptake from both compartments. The one-sided APS treatment demonstrated a significant decrease in uptake from the compartment with salt and an equivalent increase in uptake from the compartment without salt. During the 12 d following the initial 2-d trial period, there appeared to be some carry-over effect of the salinity, with the HS and APS compartments having lower (sometimes significantly) uptake compared with the compartments that had not been exposed to salinity. During the treatment application period lasting 7 d, with leaching of salts every night in between, the HS treatment had consistently lower uptake compared with the control. Uptake in the with-salt APS compartments fell to significantly lower levels on the first day and reached minimum levels after 2 d that were maintained throughout the week of the treatments. The without-salt compartments of these treatments showed increased uptake over time, with the most dramatic increase in the first days. Uptake from these nonsalinated compartments was significantly higher than the control throughout the week. For 2 d following this treatment week, the APS without-salt compartments continued to have higher uptake, and the APS with-salt compartments had lower uptake, compared with both the HS and control treatments, which were not significantly different from one another.
Whole plant-scale transpiration (Figure 7) was reduced in the HS treatment on the second day of the initial 2-d period and during the second, week-long period, but not between treatment periods. Plant-scale uptake of the APS plants, receiving salt in one compartment only, did not fall below that of the control.
Normalized compartment-scale uptake of water, referenced to the day prior to each of the treatment periods (Figure 8), showed significant differences when salt was added to both sides, reducing uptake by 26% on the first day and 44% on the second day of the first period. On these days, uptake in salty compartments of split treatments was reduced by 46 and 60% and increased in non-salt sides by 28 and 38%, respectively. On the first day following the 2-d treatment, uptake continued to be 20-25% significantly lower in all compartments that had received salts and 14% significantly higher in nonsalinized compartments compared with the control without salt. In the rest of the intermediate days between treatment periods in cycle II, there were no significant differences in uptake between the compartments in the various treatments, in spite of the averages showing continued trends. In the second treatment period, uptake in the compartments of the HS treatment was significantly lower by 8% after 1 d, 17% after 2 d, and then consistently by 20-22% on the following days compared with the control. The first day of split treatments found a 53% decrease in salt-added compartments and a 62% increase in compartments without salts added. These differences increased to an 80-85% decrease in the salt-added side and a 75-85% increase in the nonsalty compartments. Significant differences along the same trends continued for 5 d after the treatments ended (Figure 8). On these days following the treatments, the HS compartments that had been irrigated with salty water continued to take up water at rates 12-22% lower than the control without salt. The relative reduction in compartments that had previously received salts decreased in the days following the treatments such that, relative to the control, uptake was 60% lower on the first day, 36% lower on the second, and 12-34% lower thereafter. The compartments that had not received salt showed gradually declining elevated uptake: 60% higher after 1 d, 30% after 2 d, and 16-13% after 3-5 d.

FIGURE 7 Whole plant transpiration (uptake) for growth cycle II. Letters in the table below show significant differences between treatments on treatment days. Significance (F) values are shown for each day's measurements. Midday vapor pressure deficit (VPD) (kPa) is shown for each day during the cycle, and electrical conductivity (EC) (dS m −1 ) of irrigation solution for treatment days is also indicated. APS, alternating partial salinization; HS, homogenous salinization

FIGURE 8 Cycle II. Percentage change in compartment-scale uptake relative to uptake on the day before and to the no-salt control (Equation 3). Letters in the table below show significant differences between treatments on given days. Significance (F) values are shown for each day's measurements. Midday vapor pressure deficit (VPD) (kPa) is shown for each day during the cycle, and electrical conductivity (EC) (dS m −1 ) of irrigation solution for treatment days is also indicated.
The salinity treatments reduced stem water potential (Figure 9). Stem water potential was significantly lower in the HS treatment than in the without-salt control on 2 d during the second, week-long treatment period of cycle II and on the day following the treatments. The stem water potential of plants receiving salt in half of the root zone was intermediate: stem water potential was between −0.4 and −0.5 MPa for the control plants, −0.65 to −0.75 MPa for the HS plants, and −0.5 to −0.6 MPa for the APS plants receiving salt in half the root zone.
No differences between treatments were found for N, P, or Mg in leaves at the end of growth cycle II (Table 2). Differences were found for Na, Cl, Ca, and K. For Na and Cl, concentrations in leaves were highest in the with-salt control plants and lowest in the without-salt control, and plants receiving salt on one side of the root zone had intermediate concentrations. For K, the without-salt control had the highest concentration and the with-salt control plants had the lowest, with both constant and alternating one-sided salinity treatments having equivalent intermediate concentrations. The pattern for Ca, with higher concentration in plants exposed to increased NaCl, is contrary to what may be expected in plants under longer-term exposure to salinity (Rengel, 1992), indicating the uniqueness of the short-term salinity exposure in this study.

Figure 9. Stem water potential, cycle II. Letters indicate significant differences between treatments on given days. n = 4, α = .05, p < .05. APS, alternating partial salinization; HS, homogeneous salinization.

Figure 10. Concentration of potassium in irrigation and drainage water during the second growing cycle, on experimental application days and after. Shading indicates experimental days. n = 4. Letters indicate significant differences at α = .05; p < .001. APS, alternating partial salinization; HS, homogeneous salinization.
Uptake of K can be inferred from the K concentration in the drainage water of each pot during the second trial of the second growth cycle (Figure 10). Drainage K ranged from ∼20 to 40 mg L−1 in compartments constantly without salt and from 70 to >100 mg L−1 in control and split treatments with salt; the without-salt control treatment had drainage K concentrations of ∼50 mg L−1. Irrigation water contained ∼120 mg K L−1. This indicates significantly greater uptake of K when no salt was added to the irrigation water and decreased K uptake in salinized root zones. The low drainage concentrations relative to the irrigation water indicate active uptake.
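Because the lysimeters record both irrigation and drainage volumes, apparent K uptake per pot can be estimated with a simple mass balance. The helper below is a hypothetical illustration (the volumes and the function itself are not from the paper); it ignores K stored in the substrate solution, so it is only a first approximation.

```python
def k_uptake_mg(irrig_l, irrig_mg_per_l, drain_l, drain_mg_per_l):
    """Apparent potassium uptake per pot (mg) as irrigation input minus
    drainage output; substrate storage is neglected (hypothetical helper)."""
    return irrig_l * irrig_mg_per_l - drain_l * drain_mg_per_l

# Concentrations follow the ranges reported above (~120 mg K L-1 in
# irrigation; ~30 vs. ~100 mg L-1 in drainage); volumes are illustrative.
no_salt = k_uptake_mg(2.0, 120.0, 0.6, 30.0)     # nonsalinized pot
with_salt = k_uptake_mg(2.0, 120.0, 0.6, 100.0)  # salinized pot
print(no_salt, with_salt)
```

Even with identical water fluxes, the lower drainage K of nonsalinized pots translates into a larger apparent K uptake, which is the inference drawn in the text above.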

Compensative water uptake by tomato roots
In our split-root lysimeters, uptake from the two sides of the root system was close to equal whenever water availability, as set by rhizosphere salinity, did not differ between the two sides (Figures 3, 5, 6, 8). Relative differences between the two halves of tomato root systems under equal conditions were never greater than 15%. This was true whether salinity was low, and water availability therefore high, or salinity was high in both compartments.
On treatment application days, in response to changes in salinity in half of the root zone, uptake was immediately reduced in the salinized part and increased in the nonsalinized part (Figures 3, 5, 6, 8). When the added salt concentration was relatively low, compensation was quantified as increases of up to tens of percent in nonsalty compartments and equivalent decreases in salty compartments, even in cases where uptake was not affected in control plants in which both compartments received the salt solution (Figures 3, 5). When salinity was higher, plant-scale transpiration was reduced by one-sided exposure to salinity, but much less than when both sides were exposed. These plant-scale reductions in one-sided salinized root zones lasted only a few days, and transpiration eventually returned to equal that of nonsalinized control plants. The compensation resulting from a day or two of salinity exposure on one side of the root system was reversible and no longer measurable after ∼1 d of return to nonsaline conditions. Some carry-over effect and continuation of a small amount of preferential uptake was found in cases of a high level of salinity lasting more than a day (Figures 6 and 8), but this also was temporary and not measurable 3-5 d after the salinity exposure ended. The mechanisms for such a carry-over response to salinity may be connected to root hair growth and root-level hydraulic adaptation that could occur even at these short time scales (Munns et al., 2000). Tomato plants were found to have a mechanism for short-term compensative response to spatially variable conditions of available water in the root zone. This response occurs immediately and is not connected to long-term morphological or anatomical response mechanisms.

Table 2. Concentration of minerals in diagnostic leaves sampled at the end of growing cycle II. Note: HS, homogeneous salinization; APS, alternating partial salinization. Letters indicate significant differences between treatments; n = 4, α = .05.
This supports the argument that, when plants have a relatively highly available water source for uptake, most of their water needs are acquired from that source or part of the root system. We clearly documented short-term compensation indicating that, when zones of more available water exist, there will be increased uptake from those zones and correspondingly decreased uptake from zones of lower water availability, and that this is not dependent on root growth. According to this principle, as long as there is no hydraulic restriction on uptake in other areas of the root zone, a temporary reduction in water availability in some parts of the root zone will not affect plant-scale transpiration or plant activity.
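The purely hydraulic reading of this principle can be illustrated with a minimal two-compartment network, in which each root half behaves as a conductance between its local soil water potential and a common xylem potential. This is a sketch under assumed values (the conductances, potentials, and demand are invented for illustration), not the model used in this study.

```python
def compartment_uptake(psi_soil, k_root, t_demand):
    """Steady-state uptake per compartment in an Ohm's-law analogy:
    Q_i = K_i * (psi_i - psi_x), with the shared xylem potential psi_x
    chosen so that total uptake equals the transpiration demand."""
    k_tot = sum(k_root)
    psi_x = (sum(k * p for k, p in zip(k_root, psi_soil)) - t_demand) / k_tot
    return [k * (p - psi_x) for k, p in zip(k_root, psi_soil)]

# Equal conditions: uptake splits evenly between the halves.
even = compartment_uptake(psi_soil=[-0.1, -0.1], k_root=[5.0, 5.0], t_demand=1.0)
# Lower the water potential on side 1 (e.g., by salinization): uptake
# shifts toward side 2 while total uptake still meets the demand.
shifted = compartment_uptake(psi_soil=[-0.25, -0.1], k_root=[5.0, 5.0], t_demand=1.0)
print(even, shifted)
```

With these invented numbers, salinizing one side moves most of the uptake to the untreated side without changing total flow, mirroring the compensation observed here; in reality the demand itself drops once the xylem potential falls below what the stomata can sustain.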

The short-term compensation, while substantial, was not absolute and not completely consistent. Some uptake continued in the salinity-affected compartments of the split-root tomatoes and, at high enough salinity, total plant-scale transpiration was reduced. We suggest that the extent of compensation is ultimately a function of salinity level, environmental demand for water (potential evapotranspiration), and length of exposure to salinity.
There have been previous studies of the effects of salinity in split-root systems in various crops, including tomatoes (Flores et al., 2002; Sonneveld & De Kreij, 1999; Sonneveld & Voogt, 1990), cucumbers (Cucumis sativus L.) (Sonneveld & De Kreij, 1999), cotton (Gossypium hirsutum L.) (Kong et al., 2012), and grapevines (Vitis vinifera L.) (Shani et al., 1993). In all of them, salinity conditions lasted long enough for root growth and root distribution to be dominant in the response and compensation mechanisms found. In these studies, similar to ours, uptake from separate parts of the root system was equal as long as water availability was equal, whether conditions were saline or not. Other results of interest from these long-term salinity split-root studies were that yield was a function of the presence of optimal salinity, whether in one or both sides of the root system, and that fruit quality and mineral accumulation were often a function of average salinity (Sonneveld & De Kreij, 1999; Sonneveld & Voogt, 1990). Additionally, nutrient uptake was retarded in high-salinity zones when the other side had low salinity, but not when both sides had high salinity (Flores et al., 2002). McLean et al. (2011) examined responses of Melaleuca argentea W. Fitzg., a tree that in its natural habitat has roots in both soil and free water. Draining the water from around the aquatic roots had no effect on stomatal conductance as long as soil moisture for the other roots remained high (80% of field capacity). The soil-borne roots increased their hydraulic conductance threefold within 24 h and altered their aquaporin expression when the aquatic roots were not active. This suggests that compensation mechanisms are related to a plant's capability to adjust hydraulic conductance.
Our results, showing similar changes in uptake when water availability was reduced by salinity, suggest an analogous evolutionary adaptation of crop plants to temporally heterogeneous changes in root-system water availability.

Modeling
Compensation is approached in macroscopic models either empirically or via root architecture coupled with soil-root hydraulics. While the current study may suggest and direct improvement of these approaches, we remain cautious because our conclusions regarding the extent of compensation and the quantification of compensatory response mechanisms may be limited to tomatoes. Our study supports some modeled results from studies evaluating methods for representing compensation in mathematical simulations. Jarvis (2011) reported that the degree of simulated compensation was related to soil capillarity and the ratio of total effective root length to potential transpiration. Potential transpiration, in turn, depends on atmospheric conditions and aboveground plant morphological and physiological factors (height, leaf area, nonstressed stomatal conductance). Couvreur et al. (2012) modeled root water uptake with a macroscopic hydraulic-architecture approach and found in their simulations that compensatory root water uptake was initiated before plant-scale transpiration was reduced under conditions of soil water potential unequally distributed with respect to the roots.
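As a concrete example of the empirical macroscopic approach discussed above, the sketch below follows the general shape of a Jarvis-type compensatory uptake scheme: local stress factors weight the root distribution, and total uptake is reduced only when a weighted stress index falls below a critical value. The parameter values and the exact formulation are illustrative assumptions, not the equations of Jarvis (2011) or Couvreur et al. (2012).

```python
def compensated_uptake(alpha, root_frac, t_pot, omega_c=0.5):
    """Jarvis-style compensatory root water uptake (sketch).
    alpha: local stress-reduction factors (0-1) per root-zone region;
    root_frac: normalized root distribution (sums to 1);
    omega_c: critical stress index below which plant-scale uptake declines."""
    omega = sum(a * r for a, r in zip(alpha, root_frac))  # weighted stress index
    t_act = t_pot * min(omega / omega_c, 1.0)             # plant-scale reduction
    # Compensation: redistribute uptake toward less-stressed regions.
    return [t_act * a * r / omega for a, r in zip(alpha, root_frac)]

# Half of the roots strongly stressed (alpha = 0.2), half unstressed:
s = compensated_uptake(alpha=[0.2, 1.0], root_frac=[0.5, 0.5], t_pot=4.0)
print(s, sum(s))  # demand is still fully met because omega >= omega_c
```

In this configuration roughly 83% of the uptake comes from the unstressed half while total transpiration is unchanged, comparable in spirit to the up-to-85% shift measured here; lowering alpha on both sides pushes omega below omega_c and total uptake falls, echoing the behavior of the homogeneously salinized controls.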

Partial root zone drying
In partial root zone drying irrigation (PRDI), the two halves of the root zone are irrigated alternately (Adu et al., 2018; Sepaskhah & Ahmadi, 2012). The theory behind PRDI is that if half of a plant's roots are in drying soil and the other half in irrigated soil, deficit irrigation can be optimized (Kang & Zhang, 2004). The concept further suggests that ABA or other signaling from the drying or dry side in PRDI will drive stomatal closure while the wet side supplies water for transpiration and turgor. Partial root zone drying irrigation has been suggested in this way to allow greater water use efficiency (less transpiration for the same or greater production), particularly as it can maintain reproductive production even when vegetative production is reduced by deficit water status. The novel experimental setup of the current study, using soil water salinity as the driver of fast, reversible changes in soil water potential, may allow a unique view of the phenomena occurring during partial wetting and drying. Our results support critics of PRDI (Adu et al., 2019; Adu et al., 2018) in that we found no evidence for canopy-level physiological responses leading to stomatal closure due to signaling from the stressed side of the root system. If such signaling had occurred, we would expect lowered plant-scale transpiration and different plant-scale water status, as measured by stem water potential, during the trial periods when plants were exposed to partial root-zone salinity. A number of studies on PRDI (Einhorn et al., 2012; Puértolas et al., 2014) have also suggested that the technique does not elicit (partial) stomatal closure. McLean et al. (2011) likewise reported that partial stomatal closure was not consistently observed under conditions of partial root drying and hypothesized that the measured responses reflect adaptations to a native environment in which plants subjected to naturally frequent changes in root-system water availability display greater root-level compensation.

Comparing water and salinity stress
Most studies of compensation have been conducted with respect to water availability in the root system. We chose to evaluate compensation in terms of salinity, believing that osmotic potential could be better controlled and that long-term elastic responses of roots are less likely than with changes in water availability. We recognize that while, physically, matric and osmotic stresses may be expected to behave similarly, each decreasing the potential of the soil water and the potential gradient between soil and roots, a plant's biological responses to them could differ (Munns, 2002). Root responses to salinity could include toxicity at and near the root surfaces and expenditure of energy to block or contain salt ions and prevent their transport into more sensitive tissues, potentially even at the short exposure time scales maintained in our experiments. Nevertheless, the fact that the treated zone recovered completely suggests either that no such effect occurred or that it was reversible, at least within the time period of the study.
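For scale, the osmotic component imposed by saline irrigation can be approximated from its electrical conductivity with the common rule of thumb psi_o (MPa) ≈ −0.036 × EC (dS m−1) from USDA Handbook 60. The coefficient is a rough, mixed-salt approximation, not a value given in this paper.

```python
def osmotic_potential_mpa(ec_ds_m, coeff=-0.036):
    """Approximate osmotic potential (MPa) of a saline solution from its
    electrical conductivity (dS m-1), using the USDA Handbook 60 rule of
    thumb psi_o ~ -0.036 * EC. Rough, mixed-salt approximation."""
    return coeff * ec_ds_m

# The two salinity levels used in this study:
print(osmotic_potential_mpa(4.0))   # about -0.14 MPa
print(osmotic_potential_mpa(6.44))  # about -0.23 MPa
```

These magnitudes (∼0.1-0.2 MPa) are of the same order as the treatment differences in stem water potential reported above, consistent with treating the salinity step mainly as an osmotic (water-availability) perturbation.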

Compensation even at salinity levels that do not reduce uptake when the entire root system is exposed

The phenomenon that compensation occurred significantly even when the salinity level driving it did not cause a reduction in uptake when the entire root system was exposed to the same level (Figures 3-5) is likely this study's most remarkable finding. The fact that moderately saline water decreases root water uptake only locally, and only when applied locally, has important implications for the understanding and modeling of multidimensional root zone processes. One explanation is that water potential equilibrates across the root system, even in split-root systems, and that, as already discussed, the driver is purely the system's hydraulics. This and alternative explanations, including root-to-shoot signaling or regulation of aquaporin expression, deserve additional thought and study.

CONCLUSIONS
We used a novel experimental setup consisting of weighing-drainage split-root lysimeters and applied short-term salinity exposure to separate parts of the root zones of single tomato plants in order to investigate and quantify compensative uptake of water. Short-term conditions of salinity in half of the root zone of split-rooted tomatoes growing in a soil medium were found to cause compensated uptake of water, such that significantly more water was taken up in less-stressed than in more-stressed areas. Compensative uptake began immediately upon exposure to salinity. Compensation occurred even at salinity levels that did not decrease uptake when plants' entire root systems were exposed. Even at a high salinity level of 6.44 dS m−1 in the irrigation water, which decreased relative transpiration by ∼50% when the total root system was exposed, total plant uptake and transpiration were maintained at the levels of nonstressed plants, with as much as 85% of the plant-scale uptake occurring from the nonsalty part of the root system. The extent of compensation was not absolute and was apparently a function of salinity level, environmental demand for water, and length of exposure to salinity. This work on tomatoes suggests that, as long as there is no hydraulic restriction on uptake in other areas of the root zone, a temporary reduction in water availability in some parts of the root zone will not affect plant-scale transpiration or plant activity.