Journal: Frontiers in Physiology
Airway hyperresponsiveness (AHR) and airway inflammation are key pathophysiological features of asthma. Bronchial provocation tests (BPTs) are objective tests for AHR that are clinically useful to aid in the diagnosis of asthma in both adults and children. BPTs can be either “direct” or “indirect,” referring to the mechanism by which a stimulus mediates bronchoconstriction. Direct BPTs refer to the administration of pharmacological agonists (e.g., methacholine or histamine) that act on specific receptors on the airway smooth muscle. Airway inflammation and/or airway remodeling may be key determinants of the response to direct stimuli. Indirect BPTs are those in which the stimulus causes the release of mediators of bronchoconstriction from inflammatory cells (e.g., exercise, allergen, mannitol). Airway sensitivity to indirect stimuli is dependent upon the presence of inflammatory cells (e.g., mast cells, eosinophils), which respond to treatment with inhaled corticosteroids (ICS). Thus, there is a stronger relationship between indices of steroid-sensitive inflammation (e.g., sputum eosinophils, fraction of exhaled nitric oxide) and airway sensitivity to indirect compared to direct stimuli. Regular treatment with ICS does not result in the complete inhibition of responsiveness to direct stimuli. AHR to indirect stimuli identifies individuals who are highly likely to have a clinical improvement with ICS therapy, in association with an inhibition of airway sensitivity following weeks to months of treatment with ICS. Comprehending the clinical utility of direct or indirect stimuli, in either the diagnosis of asthma or the monitoring of therapeutic intervention, requires an understanding of the underlying pathophysiology of AHR and the mechanisms of action of both stimuli.
Toothed whales and bats have independently evolved biosonar systems to navigate and to locate and catch prey. Such active sensing allows them to operate in darkness, but with the potential cost of warning prey by the emission of intense ultrasonic signals. At least six orders of nocturnal insects have independently evolved ears sensitive to ultrasound and exhibit evasive maneuvers when exposed to bat calls. Among aquatic prey, on the other hand, the ability to detect and avoid ultrasound-emitting predators seems to be limited to only one subfamily of Clupeidae: the Alosinae (shad and menhaden). These differences are likely rooted in the different physical properties of air and water: in air, cuticular mechanoreceptors have been adapted to serve as ultrasound-sensitive ears, whereas ultrasound detection in water has called for sensory cells mechanically connected to highly specialized gas volumes that can oscillate at high frequencies. In addition, there are most likely differences in the risk of predation between insects and fish from echolocating predators. The selection pressure among insects for evolving ultrasound-sensitive ears is high, because essentially all nocturnal predation on flying insects stems from echolocating bats. In the interaction between toothed whales and their prey the selection pressure seems weaker, because toothed whales are by no means the only marine predators placing a selection pressure on their prey to evolve specific means to detect and avoid them. Toothed whales can generate extremely intense sound pressure levels, and it has been suggested that they may use these to debilitate prey. Recent experiments, however, show that neither fish with swim bladders nor squid are debilitated by such signals. This strongly suggests that the production of high-amplitude ultrasonic clicks serves the function of improving the detection range of the toothed whale biosonar system rather than debilitation of prey.
How bats adapt their sonar behavior to accommodate the noisiness of a crowded day roost is a mystery. Some bats change their pulse acoustics to enhance the distinction between their own echoes and another bat’s, but additional mechanisms are needed to explain the bat sonar system’s exceptional resilience to jamming by conspecifics. Variable pulse repetition rate strategies offer one potential solution to this dynamic problem, but precisely how changes in pulse rate could improve sonar performance in social settings is unclear. Here we show that bats decrease their emission rates as population density increases, following a pattern that reflects a cumulative mutual suppression of each other’s pulse emissions. Playback of artificially generated echolocation pulses similarly slowed emission rates, demonstrating that suppression was mediated by hearing the pulses of other bats. Slower emission rates did not support an antiphonal emission strategy but did reduce the relative proportion of emitted pulses that overlapped with another bat’s emissions, reducing the relative rate of mutual interference. The prevalence of acoustic interference occurring amongst bats was empirically determined to be a linear function of population density and mean emission rates. Consequently, as group size increased, small reductions in emission rates spread across the group partially mitigated the increase in interference rate. Drawing on lessons learned from communications networking theory, we show how modest decreases in pulse emission rates can significantly increase the net information throughput of the shared acoustic space, thereby improving sonar efficiency for all individuals in a group. We propose that an automated acoustic suppression of pulse emissions triggered by bats hearing each other’s emissions dynamically optimizes sonar efficiency for the entire group.
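The networking-theory intuition above can be illustrated with a minimal collision model. The sketch below assumes a pure-ALOHA channel (a standard textbook model for an unslotted shared medium; the abstract does not name a specific model), in which each pulse is lost if any other pulse overlaps it. All function names and the example numbers (group size, pulse rate, pulse duration) are illustrative, not values from the study.

```python
import math

def offered_load(n_bats, rate_hz, pulse_s):
    """Aggregate channel load G: n_bats, each emitting rate_hz pulses per
    second of duration pulse_s. G is expressed in pulses per pulse-duration
    and is dimensionless."""
    return n_bats * rate_hz * pulse_s

def aloha_throughput(load):
    """Expected rate of interference-free pulses under the pure-ALOHA
    assumption of Poisson-distributed emissions: S = G * exp(-2G)."""
    return load * math.exp(-2.0 * load)

# Illustrative only: 100 bats at 10 pulses/s with 3 ms pulses...
crowded = aloha_throughput(offered_load(100, 10, 0.003))
# ...versus the same group after a modest slowdown to 7 pulses/s.
slowed = aloha_throughput(offered_load(100, 7, 0.003))
```

Because throughput S = G·exp(−2G) peaks at G = 0.5 and falls off steeply beyond it, a heavily loaded channel gains net throughput when every emitter backs off slightly: here `slowed` exceeds `crowded` even though each bat emits fewer pulses, mirroring the abstract's claim that modest group-wide rate reductions improve sonar efficiency for all individuals.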
Echolocating bats use the time elapsed from biosonar pulse emission to the arrival of the echo (defined as echo-delay) to assess target-distance. Target-distance is represented in the brain by delay-tuned neurons that are classified as either “heteroharmonic” or “homoharmonic.” Heteroharmonic neurons respond more strongly to pulse-echo pairs in which the timing of the pulse is given by the fundamental biosonar harmonic while the timing of echoes is provided by one (or several) of the higher order harmonics. On the other hand, homoharmonic neurons are tuned to the echo delay between similar harmonics in the emitted pulse and echo. It is generally accepted that heteroharmonic computations are advantageous over homoharmonic computations; i.e., heteroharmonic neurons receive information from call and echo in different frequency bands, which helps to avoid jamming between pulse and echo signals. Heteroharmonic neurons have been found in two species of the family Mormoopidae (Pteronotus parnellii and Pteronotus quadridens) and in Rhinolophus rouxi. Recently, it was proposed that heteroharmonic target-range computations are a primitive feature of the genus Pteronotus that has been preserved in the evolution of the genus. Here, we review recent findings on the evolution of echolocation in Mormoopidae, and try to link those findings to the evolution of the heteroharmonic computation strategy (HtHCS). We stress the hypothesis that the ability to perform heteroharmonic computations evolved separately from the ability to use long constant-frequency echolocation calls, high duty cycle echolocation, and Doppler Shift Compensation. Also, we present the idea that heteroharmonic computations might have been advantageous for categorizing prey size, hunting eared insects, and living in large conspecific colonies. We make five testable predictions that might help future investigations to clarify the evolution of heteroharmonic echolocation in Mormoopidae and other families.
Renal fibrosis represents a common pathway leading to progression of chronic kidney disease. Renal interstitial fibrosis is characterized by extensive fibroblast activation and excessive production and deposition of extracellular matrix (ECM), which leads to progressive loss of kidney function. There is no effective therapy available clinically to halt or even reverse renal fibrosis. Although activated fibroblasts/myofibroblasts are responsible for the excessive production and deposition of ECM, their origin remains controversial. Recent evidence suggests that bone marrow-derived fibroblast precursors contribute significantly to the pathogenesis of renal fibrosis. Understanding the molecular signaling mechanisms underlying the recruitment and activation of bone marrow-derived fibroblast precursors will lead to novel therapy for the treatment of chronic kidney disease. In this review, we summarize recent advances in our understanding of the recruitment and activation of bone marrow-derived fibroblast precursors in the kidney and the development of renal fibrosis, and highlight new insights that may lead to novel therapies to prevent or reverse the development of renal fibrosis.
Shifts in myosin heavy chain (MHC) expression within skeletal muscle can be induced by a host of stimuli including, but not limited to, physical activity, alterations in neural activity, aging, and diet or obesity. Here, we hypothesized that both age and a long-term (2-year) high-fat/high-sugar diet (HFS) would induce a slow-to-fast MHC shift within the plantaris, soleus, and extensor digitorum longus (EDL) muscles from rhesus monkeys. Furthermore, we tested whether supplementation with resveratrol, a naturally occurring compound that has been credited with augmenting aerobic potential through mitochondrial proliferation, would counteract any diet-induced MHC changes by promoting a fast-to-slow isoform switch. In general, we found that MHC isoforms were not altered by aging during mid-life. The HFS diet had the largest impact within the soleus muscle, where the greatest slow-to-fast isoform shifts were observed in both mRNA and protein indicators. As expected, long-term resveratrol treatment counteracted, or blunted, these diet-induced shifts within the soleus muscle. The plantaris muscle also demonstrated a fast-to-slow phenotypic response to resveratrol treatment. In conclusion, diet or resveratrol treatment impacts skeletal muscle phenotype in a muscle-specific manner, and resveratrol supplementation may be one approach for promoting the fatigue-resistant MHC (type I) isoform, especially if its expression is blunted as a result of a long-term high-fat/high-sugar diet.
Chronic Fatigue Syndrome (CFS) is defined as greater than 6 months of persistent fatigue that is experienced physically and cognitively. The cognitive symptoms are generally thought to be a mild cognitive impairment, but individuals with CFS subjectively describe them as “brain fog.” The impairment is not fully understood and often is described as slow thinking, difficulty focusing, confusion, lack of concentration, forgetfulness, or a haziness in thought processes. Causes of “brain fog” and mild cognitive impairment have been investigated. Possible physiological correlates include the effects of chronic orthostatic intolerance (OI), in the form of the Postural Tachycardia Syndrome (POTS), and decreases in cerebral blood flow (CBF). In addition, fMRI studies suggest that individuals with CFS may require increased cortical and subcortical brain activation to complete difficult mental tasks. Furthermore, neurocognitive testing in CFS has demonstrated deficits in speed and efficiency of information processing, attention, concentration, and working memory. The cognitive impairments are then perceived as an exaggerated mental fatigue. As a whole, this is experienced by those with CFS as “brain fog” and may be viewed as the interaction of physiological, cognitive, and perceptual factors. Thus, the cognitive symptoms of CFS may be due to altered CBF activation and regulation that are exacerbated by a stressor, such as orthostasis or a difficult mental task, resulting in the decreased ability to readily process information, which is then perceived as fatiguing and experienced as “brain fog.” Future research should further explore these interactions, how they produce cognitive impairments, and how the perception of “brain fog” can be explained from a mechanistic standpoint.
Skeletal muscle responds to exercise-induced damage by orchestrating an adaptive process that protects the muscle from damage by subsequent bouts of exercise, a phenomenon called the repeated bout effect (RBE). The mechanisms underlying the RBE are not understood. We hypothesized that an attenuated inflammatory response following a repeated bout of lengthening contractions (LC) would coincide with an RBE, suggesting a potential relationship. Fourteen participants, men (n = 7) and women (n = 7), completed two bouts of LC separated by 28 days. Muscle biopsies were taken before the first bout (B1) from the non-exercised leg, from the exercised leg 2- and 27-d post-B1, and 2-d following the second bout (B2). A 29-plex cytokine array identified alterations in inflammatory cytokines. Immunohistochemistry quantified inflammatory cell infiltration and major histocompatibility complex class 1 (MHC-1). Muscle soreness was attenuated in the days following B2 relative to B1, indicating an RBE. Intramuscular monocyte chemoattractant protein (MCP1) and interferon gamma-induced protein 10 (IP10) increased following B2 relative to the pre-exercise sample (from 7 to 52 and from 11 to 36 pg/ml, respectively; p < 0.05). Interleukin 4 (IL4) decreased (from 26 to 13 pg/ml, p < 0.05) following B2 relative to the pre-exercise sample. Infiltration of CD68(+) macrophages and CD8(+) T-cells was evident following B2, but not B1. Moreover, CD8(+) T-cells were observed infiltrating apparently necrotic muscle fibers. No changes in MHC-1 were found. We conclude that inflammation is not attenuated following a repeated bout of LC and that CD8(+) T-cells may play a role in muscle adaptation following LC. Moreover, it appears that the muscle or the immune system becomes sensitized by an initial bout of damaging exercise such that inflammatory cell infiltration into the muscle is enhanced upon a repeated bout of damaging exercise.
During exercise the cardiovascular system has to ensure substrate supply to working muscle. The main function of red blood cells in exercise is the transport of O2 from the lungs to the tissues and the delivery of metabolically produced CO2 to the lungs for expiration. Hemoglobin also contributes to the blood’s buffering capacity, and ATP and NO release from red blood cells contributes to vasodilation and improved blood flow to working muscle. These functions require adequate amounts of red blood cells in circulation. Trained athletes, particularly in endurance sports, have a decreased hematocrit, which is sometimes called “sports anemia.” This is not anemia in a clinical sense, because athletes have in fact an increased total mass of red blood cells and hemoglobin in circulation relative to sedentary individuals. The slight decrease in hematocrit with training is brought about by an increased plasma volume (PV). The mechanisms that increase total red blood cell mass with training are not fully understood. Despite stimulated erythropoiesis, exercise can decrease the red blood cell mass by intravascular hemolysis, mainly of senescent red blood cells, which is caused by mechanical rupture when red blood cells pass through capillaries in contracting muscles, and by compression of red cells, e.g., in the soles of the feet during running or in the palms of the hands in weightlifters. Together, these adjustments cause a decrease in the average age of the population of circulating red blood cells in trained athletes. These younger red cells are characterized by improved oxygen release and deformability, both of which also improve tissue oxygen supply during exercise.
Physical activity is defined as any bodily movement produced by skeletal muscles that results in energy expenditure. The doubly labeled water method for the measurement of total energy expenditure (TEE), in combination with resting energy expenditure, is the reference for measuring physical activity under free-living conditions. To compare the physical activity level (PAL) within and between species, TEE is divided by resting energy expenditure, yielding a dimensionless figure. The PAL for sustainable lifestyles ranges between a minimum of 1.1-1.2 and a maximum of 2.0-2.5. The average PAL increases from 1.4 at age 1 year to 1.7-1.8 at reproductive age and declines again to 1.4 at age 90 years. Exercise training increases PAL in young adults when energy balance is maintained by increasing energy intake. Professional endurance athletes can reach PAL values around 4.0. Most of the variation in PAL between subjects can be ascribed to predisposition. A higher weight implies higher movement costs and less body movement, but not necessarily a lower PAL. Changes in physical activity primarily affect body composition and, to a lesser extent, body weight. Modern man has a PAL similar to that of a wild mammal of similar body size.
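The PAL arithmetic described above is a simple ratio, sketched below. The function name and the example values are illustrative only; because the ratio is dimensionless, any consistent energy unit (MJ/day or kcal/day) works.

```python
def pal(tee, ree):
    """Physical activity level: total energy expenditure (TEE) divided by
    resting energy expenditure (REE). Dimensionless, so TEE and REE may be
    in any unit as long as both use the same one."""
    return tee / ree

# Illustrative adult values in MJ/day (not from the abstract):
# TEE of 12.0 with REE of 7.0 gives a PAL of about 1.71, within the
# 1.7-1.8 range the abstract reports for reproductive age.
adult_pal = pal(12.0, 7.0)
```

A doubling of TEE at constant REE doubles PAL, which is why sustained values near 4.0, as reported for professional endurance athletes, imply roughly four times the resting energy turnover.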