Special Issue: Advancements in cardiology

SPECIAL ISSUE

Advancements in cardiology

RESEARCH ARTICLES

Mariano Ezequiel Napoli Llobera1, Lilia Luz Lobo Marquez2, Kari Kostiw3, Dave Webster4, Atilio Eugenio Costa Vitali5

1Research Fellow, Heart Failure Disease Management Program, Division of Cardiology, Health Sciences North Hospital, Sudbury, Ontario – Canada.
2Medical Director of Heart Failure Disease and Pulmonary Hypertension, Division of Cardiology, Cardiology Institute, San Miguel de Tucuman, Tucuman – Argentina.
3Clinical Manager of Heart Failure Disease Management Program, Division of Cardiology, Health Sciences North Hospital, Sudbury, Ontario – Canada.
4Director of Nuclear Medicine, Division of Cardiology, Health Sciences North Hospital, Sudbury, Ontario – Canada.
5Medical Director of Heart Failure Disease Management Program, Division of Cardiology, Health Sciences North Hospital, Sudbury, Ontario – Canada.

Abstract

Background

Transthyretin cardiac amyloidosis (ATTR) is a restrictive cardiomyopathy that typically manifests as heart failure with preserved ejection fraction (HFpEF).

The presence of unexplained left ventricular hypertrophy (LVH) in a patient with HF, together with red-flag manifestations, could increase the diagnostic probability. However, the prevalence of scintigraphy-confirmed ATTR among patients with this triad remains uncertain.

Methods

From August 1st to December 31st, 2021, 22 consecutive patients diagnosed with HF (ejection fraction greater than 40%), LVH of unexplained etiology, and at least one red-flag clinical manifestation underwent technetium-99m pyrophosphate scintigraphy (99mTc-PYP). Patients were divided into two groups, “positive” (grade 2 or 3 uptake) and “negative”. Multiple logistic regression models were fitted with the 99mTc-PYP result as the outcome and clinical characteristics as explanatory variables.

Results

Among the 22 patients, 15 had a positive 99mTc-PYP study for ATTR. The prevalence of ATTR using the triad of HFpEF, unexplained LVH and at least one red flag was 68% (95% CI, 45-86%). Patients with a positive 99mTc-PYP study tended to be male and older, with a higher mean aortic gradient and a thicker interventricular septum than patients with a negative study. The most frequent red-flag clinical manifestations were proteinuria (55%) and a pseudoinfarction pattern (55%). The presence of 2 or more red flags could increase the diagnostic probability of the test (OR 1.6, 95% CI 0.52-4.89).
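
The 68% prevalence and its 45-86% interval follow directly from the 15-of-22 positive scans; the short sketch below reproduces that arithmetic using an exact binomial confidence interval (one standard choice, not necessarily the method the authors used — the counts are the only inputs taken from the abstract).

```python
# Not the authors' code: a minimal check of the reported prevalence estimate,
# using only the counts stated in the abstract (15 positive scans out of 22).
from scipy.stats import binomtest

positive, total = 15, 22
prevalence = positive / total  # ~0.68
ci = binomtest(positive, total).proportion_ci(confidence_level=0.95, method="exact")
print(f"prevalence = {prevalence:.0%}, 95% CI {ci.low:.0%}-{ci.high:.0%}")
# Prints approximately: prevalence = 68%, 95% CI 45%-86%
```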

Conclusions

The diagnostic probability of ATTR by 99mTc-PYP scan could increase when a red-flag clinical manifestation is added to the suspected diagnosis of heart failure with left ventricular hypertrophy. The use of non-invasive techniques allows early identification and treatment of this underdiagnosed disease.

H. Dowse1, T. VanKirk2

1School of Biology and Ecology and Department of Mathematics and Statistics, University of Maine, Orono, Maine
2School of Occupational Therapy, College of Health and Pharmacy, Husson University, Bangor, Maine

Abstract

Our objective in this review is to summarize evidence of the strong cardiac rhythmicity-enhancing power of melatonin in the Drosophila melanogaster model system and to discuss the implications of these findings in the context of fundamental cardiac pacemaker function and potential clinical applications. Drosophila has proven itself an exceptional research organism given the far-reaching genetic and molecular tools it offers. We consider details of the fly’s myogenic, ion-channel-based pacemaker and summarize aspects of its neurohormonal control. Melatonin, in the context of cardiology, has predominantly been associated with its antioxidant properties in the prevention of reperfusion damage after infarct, but we have strongly confirmed the few reports of its rhythmicity-strengthening effect. We discuss our clear results showing that melatonin is capable of converting a normally noisy heartbeat into an extremely regular oscillator. It rescues the very uneven beat of the hearts of flies bearing a serious mutation in a gene encoding one of the core pacemaker ion channels. Possible mechanisms for these effects are considered.

Marina M. Tabbara1,2, Javier González3, Gaetano Ciancio1,2,4

1Department of Surgery, University of Miami Miller School of Medicine; Miami, Florida
2Miami Transplant Institute, University of Miami Miller School of Medicine, Jackson Memorial Hospital; Miami, Florida
3Servicio de Urología, Unidad de Trasplante Renal, Hospital General Universitario Gregorio Marañón; Madrid, Spain
4Department of Urology, University of Miami Miller School of Medicine; Miami, Florida

Abstract

Renal cell carcinoma (RCC) accounts for 2-3% of all malignant disease in adults and has a propensity to infiltrate adjacent structures, with a biologic predisposition for vascular invasion. This tropism for the venous system facilitates propagation into the renal vein and inferior vena cava (IVC) in up to 25% of patients with RCC. Surgical resection remains the mainstay of treatment for RCC with venous tumor thrombus (TT) extension and the only hope for a potential cure. Higher thrombus levels correlate with more advanced stages of disease and thus poorer survival rates. Although cardiopulmonary bypass (CPB) with circulatory arrest has been successfully performed during resection of these tumors, its use remains controversial due to the risk of coagulopathy, platelet dysfunction, and central nervous system complications. Complete intraabdominal surgical excision of level III thrombi can be achieved without sternotomy and CPB by utilizing hepatic mobilization maneuvers. The purpose of this review is to provide an update on the surgical management of these difficult cases of RCC with supradiaphragmatic tumor thrombi, including a description of transplant-based techniques that avoid sternotomy and CPB, minimizing intra- and post-operative complications.

Andrea M.P. Romani1

1Department of Physiology and Biophysics, School of Medicine, Case Western Reserve University, Cleveland, OH 44106-4970

Abstract

The effect of alcohol consumption on cardiac and cardiovascular function remains a point of contention in the medical field. The consumption of low or moderate amounts of alcohol has largely been associated with beneficial effects on cardiac contractility and the cardiovascular system as a whole, owing to the vasodilatory effect exerted by alcohol and the reduction in mortality documented by several studies. In contrast, excessive alcohol consumption results in negative outcomes in both men and women, including cardiac arrhythmias and atrial fibrillation, abnormal cardiac contraction leading to heart failure and dilated cardiomyopathy, and overall cardiovascular dysfunction, including hypertension. The main points of contention are two-fold: the dose of alcohol at which its perceived beneficial effects disappear and proper cardiac and cardiovascular function becomes progressively impaired, and how to clinically and therapeutically address cardiac and cardiovascular pathologies in chronic alcoholics to improve their general condition and prognosis. The present review aims to provide the reader with a general understanding of the effects of alcohol on the cardiovascular system and the pathophysiological mechanisms that lead to the most common forms of cardiac dysfunction, and to highlight the current guidelines for treatment of alcoholic cardiomyopathy to improve clinical presentation and prognosis in alcoholic patients.

José F Guadalajara-Boo1

1Instituto Nacional de Cardiología “Ignacio Chávez”, México

Abstract

Foxglove has been in use for over 200 years; in its early days it was used to reduce water retention. Since the end of the 19th century, its use in the treatment of heart failure has shown good clinical results. At the beginning of the 20th century, treatment of heart failure with digitalis began with good results, and in the middle of the century scientific studies were carried out to determine its direct effects on the heart. Ten years later, clinical experience and technological advances demonstrated the real efficacy of the digitalis effect on the failing heart, with one drawback: it is an old drug that can appear obsolete for current management in the presence of important new drugs that have proven useful in heart failure.

From the last third of the 20th century, it was shown that the failing heart benefits more from combinations of drugs than from single agents alone, so digitalis retains a role: in combination with the other drugs, it significantly enhances the beneficial effects of heart failure treatment.

Melissa J. Kimlinger1, Eric H. Mace1, Raymond C. Harris1, Ming-Zhi Zhang1, Matthew B. Barajas1, Antonio Hernandez1, Frederic T. Billings1

1Vanderbilt University School of Medicine (MJK), the Department of Surgery (EM), the Department of Medicine (RCH, MZ), and the Department of Anesthesiology (MBB, AH, FTB), Vanderbilt University Medical Center, Nashville, TN

Abstract

Introduction: Acute kidney injury (AKI) affects 10% of patients following major surgery and is independently associated with extra-renal organ injury, development of chronic kidney disease, and death. Perioperative renal ischemia and reperfusion (IR) contributes to AKI by, in part, increasing production of reactive oxygen species (ROS) and leading to oxidative damage. Variations in inhaled oxygen may mediate some aspects of IR injury by affecting tissue oxygenation, ROS production, and oxidative damage. We tested the hypothesis that provision of air (normoxia) compared to 100% oxygen (hyperoxia) during murine renal IR affects renal ROS production and oxidative damage.

Methods: We administered 100% oxygen or air to 8-9 week-old FVB/N mice and performed dorsal unilateral nephrectomy with contralateral renal ischemia/reperfusion surgery while the mice spontaneously ventilated. We subjected mice to 30 minutes of ischemia and 30 minutes of reperfusion prior to sacrifice. We obtained an arterial blood gas (ABG) sample by performing sternotomy and left cardiac puncture. We stained the kidney with pimonidazole, a marker of tissue hypoxia, and 4-HNE, a marker of ROS production, and measured F2-isoprostanes in homogenized tissue to quantify oxidative damage.

Results: Hyperoxia during IR increased arterial oxygen content compared to normoxia, but both groups of mice were hypoventilating at the time of ABG sampling. Renal tissue hypoxia following reperfusion was similar in both treatment groups. ROS production was similar in the cortex (3.8% area in hyperoxia vs. 3.1% in normoxia, P=0.19) but increased in the medulla of hyperoxia-treated animals (6.3% area in hyperoxia vs. 4.5% in normoxia, P=0.02). Renal F2-isoprostanes were similar between treatment groups (2.2 pg/mg kidney in hyperoxia vs. 2.1 pg/mg in normoxia, P=0.40).
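
The "% area" figures quoted above are fractions of the tissue section classified as stain-positive. As a rough illustration only (not the study's image-analysis pipeline, whose software and thresholds are not given in the abstract), such a metric can be computed from a segmented staining image as follows.

```python
# Illustrative sketch: percent-positive-area metric for a stained section.
# The image, tissue mask, and threshold below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
stain_intensity = rng.random((512, 512))           # stand-in for a scanned 4-HNE image
tissue_mask = np.ones_like(stain_intensity, bool)  # tissue vs. background mask
positive = stain_intensity > 0.95                  # hypothetical positivity cutoff

percent_area = positive[tissue_mask].mean() * 100
print(f"positive area: {percent_area:.1f}%")       # ~5% with this toy threshold
```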

Conclusions: Hyperoxia during spontaneous ventilation in murine renal IR did not appear to affect renal hypoxia following reperfusion, but hyperoxia increased medullary ROS production compared to normoxia.

Magda Havas1 & Jeffrey Marrongelle2

1Trent School of the Environment, Trent University, Peterborough, ON, Canada
2Bioenergimed Metabolic Institute, Schuylkill Haven, PA, USA

Abstract

This is a double-blind, placebo-controlled replication of a study that we previously conducted in Colorado with 25 subjects designed to test the effect of radio frequency radiation (RFR) generated by the base station of a cordless phone on heart rate variability (HRV). In this study, we analyzed the response of 69 subjects between the ages of 26 and 80 in both Canada and the USA. Subjects were exposed to radiation for 3-min intervals generated by a 2.4-GHz cordless phone base station (3–8 microW/cm2). Prior to provocation we conducted an orthostatic test to assess the state of adrenal exhaustion, which interferes with a person’s ability to mount a response to a stressor. A few participants had a severe reaction to the RFR with an increase in heart rate and altered HRV indicative of an alarm response to stress. Based on the HRV analyses of the 69 subjects, 7% were classified as being “moderately to very sensitive”, 29% were “little to moderately sensitive”, 30% were “not to a little sensitive” and 6% were “unknown”. These results are not psychosomatic and are not due to electromagnetic interference. Twenty-five percent of the subjects’ self-proclaimed sensitivity corresponded to that based on the HRV analysis, while 32% overestimated their sensitivity and 42% did not know whether or not they were electrically sensitive. Of the 39 participants who claimed to experience some electrical hypersensitivity, 36% claimed they also reacted to a cordless phone and experienced heart symptoms and, of these, 64% were classified as having some degree of electrohypersensitivity (EHS) based on their HRV response. Novel findings include documentation of a delayed response to radiation. This protocol underestimates the reaction to electromagnetic radiation and may provide a false negative for those with a delayed reaction and/or with adrenal exhaustion. Orthostatic HRV testing combined with provocation testing may provide a useful diagnostic tool for some sufferers of EHS when they are exposed to electromagnetic radiation. It can be used to confirm EHS but not to reject EHS as a diagnosis since not everyone with EHS has an ANS reaction to electromagnetic radiation. 

Rita Marinheiro1, Leonor Parreira1, Cláudia Lopes1, Pedro Amador1, Dinis Mesquita1, José Farinha1, Ana Esteves1, Joana Ferreira1, Rui Coelho1, Jeni Quintal1, Rui Caria1

1Centro Hospitalar de Setubal, Cardiology Department, Setubal, Portugal

Abstract

Introduction: Most avoidable defibrillator therapies can be prevented by evidence-based programming, but defining tachycardia configurations across all device manufacturers is not straightforward.

Aims: To determine whether uniform programming of tachycardia zones, independent of the manufacturer, results in a lower rate of avoidable shocks in primary-prevention heart failure (HF) patients, and whether programming high-rate or delayed therapies provides additional benefit.

Methods: Prospective cohort with historical controls. HF patients with a primary-prevention indication for a defibrillator were randomized to receive one of two new programming configurations (high-rate or delayed therapies). A historical cohort of patients with conventional programming was analyzed for comparison. The primary endpoint was any therapy [shock or anti-tachycardia pacing (ATP)] delivered. Secondary endpoints were appropriate shocks, appropriate ATP, appropriate therapies, inappropriate shocks, syncope and death.

Results: 89 patients were assigned to the new programming group [high-rate (n=47) or delayed therapy (n=42)] and compared with 94 historical patients with conventional programming. During a mean follow-up of 20±7 months, the new programming was associated with a reduction in any therapy (HR 0.265, 95% CI 0.121-0.577, p=0.001), even after adjustment. Appropriate ATP and any shock were also reduced. Syncope did not occur. Sudden, cardiovascular and all-cause deaths were not different between the groups. In the new programming group, neither high-rate nor delayed programming was better than the other.

Conclusions: In our study, programming tachycardia zones homogeneously across all manufacturers was possible and resulted in a lower rate of therapies, shocks and appropriate ATP.

Deniz Karasoy1, Gunnar Hilmar Gislason2,5, Christian Torp-Pedersen2, Jonas Bjerring Olesen2, Matteo Anselmino3, Patricia Fruelund1,4, Jacob Moesgaard Larsen1,4, Sam Riahi1,4

1Department of Cardiology, Aalborg University Hospital, Denmark
2Department of Cardiology, Copenhagen University Hospital Herlev and Gentofte, Denmark
3Division of Cardiology, “Città della Salute e della Scienza di Torino” Hospital, Department of Medical Sciences, University of Turin, Italy
4Department of Clinical Medicine, Aalborg University, Denmark
5The Danish Heart Foundation, Copenhagen, Denmark

Abstract

Objective: We investigated the long-term cardiovascular outcomes associated with direct oral anticoagulants (DOACs), antiplatelets and No-Treatment compared to warfarin beyond 90 days after atrial fibrillation (AF) catheter ablation.

Methods: We identified 12,010 AF patients undergoing first-time ablation in Denmark (2002-2018) and analyzed stroke, serious bleeding, cardiovascular death and the composite of these three endpoints (MACE) using incidence rates (IR) per 1000 person-years and Cox proportional-hazards models.
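
The abstract does not include analysis code; the sketch below only illustrates, on synthetic data, the two quantities named in the Methods — a crude incidence rate per 1000 person-years and a Cox proportional-hazards model (here via the open-source lifelines library) — with a single hypothetical DOAC-versus-warfarin covariate.

```python
# Synthetic-data illustration of the reported analysis style; not the study's code.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 500
doac = rng.integers(0, 2, n)                    # 1 = DOAC, 0 = warfarin (reference)
followup = rng.exponential(scale=5.0, size=n)   # person-years of follow-up
stroke = (rng.random(n) < 0.10).astype(int)     # observed event indicator

df = pd.DataFrame({"followup_years": followup, "stroke": stroke, "doac": doac})

# Crude incidence rate per 1000 person-years.
ir = df["stroke"].sum() / df["followup_years"].sum() * 1000
print(f"incidence rate: {ir:.1f} per 1000 person-years")

# Cox proportional-hazards model; exp(coef) for `doac` is the hazard ratio
# versus the warfarin reference group.
cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="stroke")
cph.print_summary()
```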

Results: The median age was 62 years (interquartile range [IQR]: 54-68 years); 28.8% were female, 7225 (60.2%) patients were younger than 65 years, and 6927 (57.7%) patients had a CHA2DS2-VASc score ≥2. Over a total of 65,990 person-years of follow-up commencing 90 days after first-time ablation, warfarin, DOAC, antiplatelet and No-Treatment exposures covered 30,877 (46.8%), 9,452 (14.3%), 6,003 (9.1%) and 19,657 (29.8%) person-years, respectively. There was no difference between DOACs and warfarin (HR 1.04, 95% CI 0.77-1.42), while antiplatelets (HR 1.50, 95% CI 1.11-2.05) and No-Treatment (HR 1.50, 95% CI 1.15-1.94) were associated with a significantly higher rate of stroke. DOACs (HR 0.70, 95% CI 0.54-0.92), antiplatelets (HR 0.58, 95% CI 0.41-0.82) and No-Treatment (HR 0.52, 95% CI 0.39-0.69) were associated with a significantly lower rate of serious bleeding compared with warfarin. We found no difference between DOACs and warfarin (HR 0.87, 95% CI 0.61-1.25), while antiplatelets (HR 1.42, 95% CI 1.04-1.94) and No-Treatment (HR 2.77, 95% CI 2.16-3.56) were associated with a significantly higher rate of cardiovascular death. We observed no difference with DOACs (HR 0.86, 95% CI 0.70-1.05), antiplatelets (HR 1.04, 95% CI 0.84-1.27) or No-Treatment (HR 1.10, 95% CI 0.93-1.31) compared to warfarin in multivariable analyses of the composite endpoint of MACE.

Conclusions: Our study indicates a better bleeding risk profile associated with DOACs than warfarin in patients undergoing AF ablation, but no difference for the endpoints of stroke, cardiovascular death, or the composite endpoint of MACE. Despite the favourable bleeding risk, antiplatelets and No-Treatment compared with warfarin appear hazardous due to a higher rate of stroke and cardiovascular death.

Arthur E. Frankel1, Tazio Capozzola2, Raiees Andrabi2, Chul Ahn3, Panpan Zhou2, Wan-ting He2, Dennis R. Burton2

1Department of Medicine, West Palm Beach VA Medical Center, West Palm Beach, FL
2The Scripps Research Institute, La Jolla, CA, USA
3Division of Biostatistics, Department of Population and Data Sciences, University of Texas Southwestern Medical School, Dallas, TX, USA

Abstract

Immunocompromised cancer patients are at significant risk of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection. A method to identify those patients at highest risk is needed so that prophylactic measures may be employed. Serum antibodies to the SARS-CoV-2 spike protein are important markers of protection against COVID-19 disease. We evaluated total and neutralizing antibody levels before and after a third booster vaccine and compared responses among different cancer-bearing and healthy veterans. This was a prospective, single-site, comparative cohort observational trial. The setting was the West Palm Beach VA Medical Center cancer center. All veterans received a third SARS-CoV-2 mRNA booster. The main outcomes were anti-SARS-CoV-2 spike IgG and neutralizing antibodies to wild-type virus and the B.1.617, BA.1, BA.2, and BA.4/5 variants. Disease type and therapy, COVID-19 infection, and anti-CD20 antibody treatments were documented. The third mRNA vaccine booster increased the mean blood anti-spike IgG five-fold. The second anti-spike level was equal to or greater than the first in 129/140 veterans. All groups except the myeloma group had post-booster antibody levels significantly higher than pre-booster, with 4-fold, 12-fold, 4-fold, 6-fold and 3.5-fold increases for the control, solid tumor, CLL, B cell lymphoma and all B cell malignancy cohorts, respectively. The myeloma group showed only a non-significant 1.7-fold increase. Patients recently treated with anti-CD20 antibodies had approximately 200-fold less anti-spike IgG production after the vaccine booster than other patients. There was a 2.5-fold enhancement of mean neutralizing antibodies to wild-type virus after the third mRNA booster, and mean neutralization of the Delta and Omicron variants increased 2.2-, 6.5-, 7.7-, and 6.2-fold versus pre-boost levels. B cell malignancies failed to show increased post-booster neutralization. The third SARS-CoV-2 booster increased total anti-spike IgG and neutralizing antibodies for most subjects. Veterans with B cell malignancies, particularly myeloma, and those receiving anti-CD20 monoclonal antibodies had the weakest humoral responses. Neutralizing antibody responses to Omicron variants were lower than those to wild-type virus. A subset of patients without humoral immunity post-booster should be considered for prophylactic antibody or close monitoring.

Angel Martin Castellanos1,2

1Cardiovascular Sciences and Sports Medicine Center, Caceres, Spain.
2Department of Anatomy. Research Group in Bio-Anthropology and Cardiovascular Sciences, University of Extremadura, Faculty of Nursing and Occupational Therapy, Caceres, Spain.

Abstract

Despite the impact of the COVID-19 pandemic, myocardial infarction remains the leading cause of cardiovascular death in Europe. Body mass index (BMI)-defined obesity is a major risk factor for myocardial infarction. However, in studies associating anthropometrics with myocardial infarction, a lack of balance between simple body measurements when comparing healthy and unhealthy cases has been shown to affect the outcome. Thus, regardless of the strength of association of an anthropometric measure, other criteria for judging biological causality must be investigated.

We aim to assess different studies worldwide to understand the key concepts that reveal association biases for anthropometrics when predicting myocardial infarction risk. In this approach, natural mathematical inequalities between simple measurements in healthy subjects were investigated. Weight, height, height/2, waist circumference and hip circumference are absolute values that do not, by themselves, express the true risk. Hence the mathematical concept of a fraction or ratio in anthropometrics such as BMI, waist-to-hip ratio (WHR) or waist-to-height ratio (WHtR) plays an important role, and some anthropometrics may be seen as confounding variables when measuring high-risk body composition. Weight is a confounding factor that does not indicate a high-risk body composition, meaning that BMI is not fully predictive. WHR is a confounding variable with respect to waist and WHtR because of imbalances between the mean hip and waist and between the mean hip and height, respectively, which produce a protective overestimation for hip relative to waist and height. Waist circumference may be a confounding variable with respect to WHtR due to an imbalance between the mean waist and height. This occurs if, and only if, the WHtR risk cut-off is >0.5 and height is ignored as a volume factor, thereby overestimating the risk associated with waist circumference in the tallest people and underestimating it in the shortest. Mathematically and anthropometrically, only the WHtR-associated risk, above BMI, waist and WHR, holds true when considered as a relative risk volume linked to a causal pathway of higher cardiometabolic risk.
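
For readers unfamiliar with the indices being contrasted, the short sketch below simply computes BMI, WHR and WHtR for one hypothetical subject (the measurements are invented, not taken from the review); the review's argument is that only WHtR relates waist to height and can therefore be read as a relative risk volume, with >0.5 as the risk cut-off it cites.

```python
# Hypothetical subject, for illustration only; values are not from the review.
weight_kg, height_m = 80.0, 1.75
waist_cm, hip_cm = 94.0, 100.0

bmi  = weight_kg / height_m**2        # body mass index, kg/m^2
whr  = waist_cm / hip_cm              # waist-to-hip ratio
whtr = waist_cm / (height_m * 100.0)  # waist-to-height ratio (risk cut-off >0.5)

print(f"BMI = {bmi:.1f}, WHR = {whr:.2f}, WHtR = {whtr:.2f}")
# BMI = 26.1, WHR = 0.94, WHtR = 0.54
```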

In conclusion, WHtR is the only metric that is directly associated with a risk volume and has greater biological plausibility. It should be used to assess anthropometrically measured myocardial infarction risk, once the imbalances between measurements and the resulting association biases are recognised.

Enrique C. Morales-Villegas1, Luis A. Alcocer-Diaz-Barreiro2, Gualberto Moreno-Virgen1

1Cardiometabolic Research Center, MAC Hospital, Aguascalientes, México.
2Mexican Institute of Cardiovascular Health.

Abstract

In patients with hypertension (HT), cardiovascular risk reduction is directly proportional to the reduction in blood pressure sustained over time. However, in “real life,” blood pressure control is often insufficient or not sustained over time to achieve optimal cardiovascular risk reduction. In this article, we comment on the multiple reasons which explain this common therapeutic failure.
 
Also, in this article, we summarize the amazing basic and clinical phase III evidence of azilsartan (AZL) and azilsartan combined with chlortalidone (CLD), two excellent therapeutic options for HT control. With such evidence as scientific background, we communicate our results with almost 300 HT patients treated with azilsartan and azilsartan/chlortalidone in “real life.” In brief, our findings were the following:
 
  a) In HT patients with blood pressure (BP) <150/90 mmHg, AZL 40 mg as monotherapy provides practically 100% success in achieving a target BP <140/90 and <130/80 mmHg, in a subpopulation that we have called “hyper-responders”;
  b) In HT patients with BP <150/90 mmHg (naive or after another treatment failure), AZL/CLD 40/12.5 mg provides practically 100% success in achieving a target BP <140/90 mmHg and 90% in achieving a target BP <130/80 mmHg;
  c) In HT patients with BP >150/90 mmHg (generally after another treatment failure), AZL/CLD 80/12.5 mg gives women a success rate greater than 60% in achieving a target BP <140/90 mmHg and greater than 50% in achieving a target BP <130/80 mmHg. The success rates were higher in men: greater than 75% for a target BP <140/90 mmHg and greater than 60% for a target BP <130/80 mmHg. In both cases, adding amlodipine (2.5, 5, or 10 mg) made it possible to achieve a target BP <140/90 mmHg in 100% of cases and <130/80 mmHg in 80% of cases.

Finally, according to our results, we propose a simple three-step strategy based on evidence, personalization, and empowerment which allows reaching a target BP <140/90 mmHg in more than 90% of cases and a target BP <130/80 mmHg in more than 75% of cases in 4 to 12 weeks.

REVIEW ARTICLES

Shunsuke Kiuchi1, Takanori Ikeda1

1Department of Cardiovascular Medicine, Toho University Graduate School of Medicine, Tokyo, Japan

Abstract

There had been no effective cardioprotective medications for heart failure with preserved ejection fraction (HFpEF). Therefore, treatment intervention at the hypertension (HT) stage (stage A), a major risk factor for HFpEF, is necessary. In fact, the SPRINT and STEP trials reported that strict, intensive blood pressure (BP) control was useful, reducing the primary endpoints, including cardiovascular events, by approximately 25%. The effectiveness of BP reduction for HFpEF after the onset of HF (stage C or D) has been reported and generally follows the J-curve phenomenon. Both left ventricular systolic/diastolic dysfunction and vascular failure are related to the pathophysiology of HF. In the case of coexisting vascular failure, BP-lowering treatment is effective because it decreases afterload. However, BP-lowering treatment has been reported to increase the incidence of renal dysfunction; it is therefore important to consider the degree of association with vascular failure and the involvement of multiple organs when determining the target BP. The decision on the target BP and the optimal choice of cardioprotective/antihypertensive medications for HF should be based on the pathologic condition.

Margaret V Ragni1

1University of Pittsburgh

Abstract

This is an exciting time in hemophilia treatment, with the unprecedented development of novel non-factor therapies. These agents re-balance hemostasis in patients with hemophilia A and B, with and without inhibitors, tipping the balance toward hemostasis and improved thrombin generation. While there have been numerous publications about the beneficial hemostatic effects and significant bleed reduction possible with these novel non-factor agents, little has been written about the less well-recognized thrombotic complications. Yet the latter underscore the fine balance between hemostasis and thrombosis and the fact that these agents prevent but do not treat bleeds, so clotting factor is still required to treat acute bleeds. The purpose of this Commentary is to review the thrombotic complications that have occurred with non-factor therapies, risk factors for thrombosis, potential mechanisms, and potential mitigation approaches.

Yunfu Lv1, Jinfang Zheng1, Jincai Wu1, Zhensheng Zhang1

1Hepatobiliary Surgery Center, Hainan General Hospital, Hainan Medical College Affiliated People’s Hospital, Haikou 570311, China.

Abstract

Objective: To explore the etiology, diagnosis and treatment of non-foreign body secondary choledochectasis.

Methods: The clinical data of 162 cases admitted from January 1994 to December 2021 were retrospectively studied.

Results: The causes of non-foreign-body secondary dilatation were accurately identified by careful history-taking, physical examination, imaging and laboratory tests and classified into 10 categories, including inflammation; sphincter of Oddi dysfunction; prior gastric bariatric surgery; compression; compensation; bile duct injury; duodenal disease; and other factors.

Conclusion: The etiology of non-foreign body secondary choledochectasis is complex. Effective treatment should be selected according to different etiologies.

Dr. Mario Sutil-Vega1, Dr. Marcelo Rizzo1, Dr. Fadwa Taibi-Hajjami1, Dr. Carlos Roca-Guerrero1, Dr. Íngrid Colomer Asenjo1, Dr. Meritxell Lloreda Surribas1, Dr. Núria Mallofré Vila1, Dr. Gabriel Torres Ruiz1, Dr. Paola Rojas Flores1, Dr. Antoni Martínez-Rubio1

1Cardiology Department of the University Hospital of Sabadell (Autonomous University of Barcelona). Parc Taulí 1, 08208 Sabadell, Spain.

Abstract

Introduction: In patients with heart failure, global longitudinal strain (GLS) detects decreased ventricular contractility early and has prognostic value, but there is no evidence that GLS properly differentiates etiologies in patients with left ventricular ejection fraction <50%.

Methods and aims: 147 patients with heart failure and left ventricular ejection fraction <50% were included retrospectively. The aims were to compare GLS between patients with heart failure with reduced (<40%) and mildly reduced (40-49%) ejection fraction, and to compare GLS among the different etiologies within each of these two subpopulations.

Results: 78 patients (53%) presented mildly reduced and 69 (47%) reduced ejection fraction. The mean GLS was -13.4% ± 3.3% (mildly reduced: -14.9% ± 2.9%; reduced: -11.7% ± 3.0%; p <0.001). In the mildly reduced ejection fraction group, the etiologies were ischemic (47.4%), idiopathic (25.6%), tachycardiomyopathy (12.8%), valvular (11.6%), and toxic (2.6%), with similar mean GLS (p = ns among all etiologies). In the reduced ejection fraction group, the etiology was ischemic in 50.7% of patients, idiopathic in 24.6%, valvular in 10.1%, tachycardiomyopathy in 8.7%, and toxic in 5.8%, with similar mean GLS (p = ns among all etiologies).
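
The abstract reports the two groups' GLS as mean ± SD with p < 0.001 but does not name the statistical test; as a hedged check, a two-sample Welch t-test computed from those summary statistics (an assumption on our part, not necessarily the authors' method) is consistent with the reported significance.

```python
# Rough check from the summary statistics in the abstract; the choice of a
# Welch t-test is an assumption, not taken from the paper.
from scipy.stats import ttest_ind_from_stats

# GLS: mildly reduced EF group -14.9 +/- 2.9 (n=78); reduced EF group -11.7 +/- 3.0 (n=69)
res = ttest_ind_from_stats(mean1=-14.9, std1=2.9, nobs1=78,
                           mean2=-11.7, std2=3.0, nobs2=69,
                           equal_var=False)
print(res.pvalue)  # far below 0.001, consistent with the reported p < 0.001
```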

Conclusions: There were no significant differences in GLS between the etiologies of heart failure in either subpopulation. Patients with reduced ejection fraction presented worse GLS.

Satheesh Pedipina, Dr Anuj Singhal, Vivek Guleria, Baby Sarvani

Abstract

Background: In the management of patients with ischemic stroke and transient ischemic attack, determining the source of the embolic event is critical. In 15-30 percent of all strokes, cardiogenic embolism is suspected to be the cause. In these patients, echocardiography is a frequently utilized and adaptable method that can provide full information on thromboembolic risk. This article discusses the relevance of transthoracic echocardiography (TTE) and transesophageal echocardiography (TEE) in clinical practice and analyses probable cardiac origins of stroke. The aim of this study was to determine the clinicoradiological associations of ischemic stroke in the young and to establish the relevance of transthoracic and transesophageal echocardiography in the evaluation of a cardiac source in ischemic stroke.

Methods: This cross-sectional observational study was conducted between July 2020 and April 2022. A total of 31 patients with ischemic stroke were included. The mean Glasgow Coma Scale (GCS) and National Institutes of Health Stroke Scale (NIHSS) scores of these patients were recorded. They underwent transthoracic echocardiography followed by transesophageal echocardiography; their clinical profiles were recorded and analyzed using Statistical Package for the Social Sciences (SPSS), V21 software.

Results: The mean age of the subjects was 34.74 ± 6.38 years; the group consisted of 28 males and 3 females. The most common complaints were one-sided weakness and speech difficulty. Transesophageal echocardiography detected a bicuspid aortic valve in one patient, a right atrial thrombus in one patient, and a patent foramen ovale in one patient over and above the transthoracic echocardiography findings in this study.

Conclusion: This study reinforces the importance of transesophageal echocardiography in stroke in young patients. Transesophageal echocardiography is not only an effective tool for detecting clots inside the cardiac chambers but was also found to be significantly effective in detecting valvular lesions and small septal defects such as patent foramen ovale. It is recommended that all young patients with stroke undergo transthoracic as well as transesophageal echocardiography to identify the source and etiology of cardioembolic stroke.

Jie Huang1, Yi Mi2, Junxi Li3, Gary R McLean4,5

1School of Public Health and Emergency Management, Southern University of Science and Technology, Shenzhen, Guangdong, China
2Beijing No.4 High School International Campus
3Shenzhen College of International Education, Shenzhen, Guangdong, China
4School of Human Sciences, Cellular Molecular and Immunology Research Centre, London Metropolitan University
5National Heart and Lung Institute, Imperial College London, London, UK

Abstract

Coronavirus (CoV) has been one of the most widely used words during the past two years. If it were announced that Delta CoV only affects animals such as pigs and wigeons while Omicron CoV does not even exist, surely people would be offended and question the credibility of whoever stated this. Yet both statements are scientifically true. Of note, the statements refer to Delta CoV and Omicron CoV, not to the Delta or Omicron variants of SARS-CoV-2. Such potentially confusing naming of a globally important virus therefore warrants further analysis. At the subfamily level, CoVs are divided into four genera (Alpha, Beta, Gamma, Delta), and only viruses of the Alpha and Beta branches infect humans. Now that the Omicron variant of SARS-CoV-2 has taken over from the Delta variant globally, the issue of the double use of genus labels (Alpha, Beta, Gamma, Delta) for variant naming is mitigated. However, we can still pause and ponder whether the Greek symbols alone are indeed ideal for labeling waves of SARS-CoV-2 variants. Here we propose additional criteria for the naming of variants that consider specific biological and molecular characteristics of the virus-cell interaction. Our aim is to define a biologically and structurally defined metric that can be used to distinguish SARS-CoV-2 variants' interactions with host cells. This metric could find utility with numerous human viruses and provide an additional parameter for improved naming of viruses.

David F. Wieczorek1

1Department of Molecular Genetics, Biochemistry and Microbiology, University of Cincinnati College of Medicine, 231 Albert Sabin Way, Cincinnati, OH 45267-0524

Abstract

Numerous molecular and biochemical processes regulate protein production in the cell. One of these processes, phosphorylation, allows the cell to rapidly adapt to changing physiological situations. In terminally differentiated cells, such as cardiomyocytes, phosphorylation of sarcomeric proteins controls contraction and relaxation under both normal and stressful conditions. The focus of this review is how phosphorylation of sarcomeric proteins alters physiological performance in cardiac muscle, with a particular emphasis on the thin filament protein tropomyosin. This topic is addressed by the examination of tropomyosin isoform expression and its phosphorylation state from embryonic to adult murine development. Next, studies are examined that utilize in vivo model systems to express genetically altered tropomyosin transgene constructs mimicking phosphorylation or dephosphorylation. Results show that tropomyosin isoform expression is highly regulated, along with its phosphorylation state. Transgenic mouse hearts which express high levels of a constitutively phosphorylated tropomyosin develop a severe dilated cardiomyopathy and die within a month. A more moderate expression of this phosphorylation mimetic leads to normal systolic performance, but impaired diastolic function. When tropomyosin is dephosphorylated, the transgenic mice develop a compensated cardiac hypertrophy without systolic or diastolic alterations. Interestingly, when dephosphorylated tropomyosin is co-expressed with a hypertrophic cardiomyopathy tropomyosin mutation, the pathological phenotype is rescued with improved cardiac function and no indices of systolic or diastolic dysfunction. These studies demonstrate the functional significance of tropomyosin phosphorylation in determining cardiac performance during both normal and pathological conditions.


Abstract

Advances in technology and the media have made it possible to obtain knowledge quickly and comprehensively in every science, so students can acquire information in this way, and entire professional degrees can now be completed online. Is the physical presence of a professor therefore necessary to study medicine and its specialties?

In this paper, the forms of learning at the different levels of professional training are analyzed: objective learning (what students should know), the objectives pursued with learning, and the role of culture and humanism in medicine; students must also learn the techniques of evidence-based medicine and the clinical guidelines. Exam preparation courses and how the learning of medical education should be structured are examined, as well as how the medical attitude towards the patient should be individualized, the role of clinical judgment in the physician, and terminal efficiency, all within the university career of medicine and its specialties in tertiary hospitals. In conclusion: the physical presence of the expert professor is important for tutorial teaching that guides the academic and ethical development of the doctor in training throughout his or her career, in order to produce an excellent clinical doctor who masters the wonderful modern technology that he or she uses.


Abstract

Li-Fraumeni syndrome (LFS) is an autosomal dominant cancer predisposition syndrome. Germline pathogenic/likely pathogenic variants (P/LPVs) in the TP53 gene are the only known genetic cause of this entity. Due to the severe phenotype and controversy over increasing surveillance and risk-reducing measures, TP53 testing has traditionally only been offered when strict criteria were met. However, with the application of next generation sequencing (NGS) to multigene testing (MGT), such as hereditary breast cancer panels, TP53 variants are being increasingly detected. In our multidisciplinary program, 2389 TP53 molecular tests were performed between January 2000 and December 2021, resulting in the identification of 29 carriers harboring 20 different TP53 P/LPVs, including one not previously described [c.242del p.(Thr81Asnfs*42)] and another of variable penetrance [c.799C>T p.(Arg267Trp)]. Two molecular findings with low allele frequencies (LAF) required additional diagnostic workup. Family phenotypes fulfilled the Chompret criteria (n=14), the classic criteria (n=4), or none of the previously described clinical criteria (n=4). Across all registered cancers, patients had an earlier first cancer diagnosis when harboring DNE_LOF (dominant negative, loss of function), notDNE_LOF, frameshift, and splicing variants (p<0.05), in contrast with notDNE_notLOF and unclassified variants. Breast cancer (either as a first or subsequent diagnosis) and cancers other than sarcomas and CNS tumors were diagnosed earlier in patients with notDNE_LOF variants (p<0.05).

For a follow-up of 51.5 months (2–118.9), we registered 11 deaths, 9 new cancers (all in previous cancer survivors), and 6 relapses (50% sarcoma cases). Radiotherapy-associated cancer was observed in one new cancer diagnosis. One healthy male underwent preimplantation genetic testing. With this study, we reinforce the need to provide multidisciplinary programs, even for a rare patient population, to avoid clinical mismanagement.

REVIEW ARTICLES

Shunsuke Kiuchi1, Takanori Ikeda1

1Department of Cardiovascular Medicine, Toho University Graduate School of Medicine, Tokyo, Japan

Abstract

There had been no effective cardioprotective medications for heart failure with preserved ejection fraction (HFpEF). Therefore, treatment intervention at the hypertension (HT) stage (stage A), which is a major factor in HFpEF, is necessary. In fact, the SPRINT and STEP trials reported that strict and intensive blood pressure (BP) control was useful, reducing the primary endpoints, including cardiovascular events, by approximately 25%. The effectiveness of BP reduction for HFpEF after the onset of HF (stage C or D) has been reported and shown to generally follow the J-curve phenomenon. Both left ventricular systolic/diastolic dysfunction and vascular failure are related to the pathophysiology of HF. In the case of coexisting vascular failure, BP lowering treatment is effective because it decreases the afterload. However, BP lowering treatment has been reported to increase the incidence of renal dysfunction; therefore, when determining the target BP, it is important to consider the degree of association with vascular failure and the involvement of multiple organs. The decision on the target BP and the optimal choice of cardioprotective/antihypertensive medications for HF should be based on the pathologic condition.

Margaret V Ragni1

1University of Pittsburgh

Abstract

This is an exciting time in hemophilia treatment with the unprecedented development of novel non-factor therapies. These agents have re-balanced hemostasis in patients with hemophilia A and B, with and without inhibitors, tipping the balance toward hemostasis and improved thrombin generation. While there have been numerous publications about the beneficial hemostatic effects and significant bleed reduction possible with these novel non-factor agents, little has been written about the less well-recognized thrombotic complications. Yet, the latter underscore the fine balance between hemostasis and thrombosis and the fact that these agents prevent but do not treat bleeds, so that clotting factor is still required to treat acute bleeds. The purpose of this Commentary is to review the thrombotic complications that have occurred with non-factor therapies, risk factors for thrombosis, potential mechanisms, and potential mitigation approaches.

Saif Khan1, Shinyi Ding1, Aimee Hong1, Sunny Chen1, Abel Ketama1, Jiwang Chen1,2

1Center for Cardiovascular Research
2Division of Pulmonary and Critical Care Medicine, Sleep and Allergy, Department of Medicine, University of Illinois at Chicago, IL, USA

Abstract

Environmental exposure to bisphenol A (BPA) is a pervasive and growing concern. BPA is a high-volume industrial chemical that possesses estrogen-like properties and functions as an environmental endocrine disruptor. It is used extensively in the production of polycarbonate plastics and epoxy resins for food and beverage packaging and hygienic products. The increasing amount of plastic pollution prevalent throughout the world has resulted in nearly ubiquitous exposure of humans and animals alike to BPA. Concerns have surfaced accordingly, surrounding the potentially detrimental effects that might result from BPA leaching into foods and beverages. The growing body of epidemiological studies on BPA has since identified links between BPA-induced oxidative stress, cardiovascular diseases, and hypertension. This review incorporates current literature examining BPA exposure in clinical and epidemiological studies, encompassing the physiological and toxicological effects that BPA can impose on the human cardiovascular system.

Yugar-Toledo JC1, Dinamarco N2, Vilela-Martin F3, Rodrigues B4, Moreno H4

1Cardiology and Endocrinology Institute of São José do Rio Preto (ENDOCOR), São José do Rio Preto – SP, Brazil.
2Department of Cardiology, Santa Cruz State University (UESC), Ilhéus – BA, Brazil.
3Department of Internal Medicine, Rio Preto Faculty of Medicine (FAMERP), São José do Rio Preto – SP, Brazil.
4State University of Campinas (UNICAMP), Campinas – SP, Brazil.

Abstract

Cigarette smoke is a complex mixture of about 7,000 different toxic substances, many of which are generated during the burning of the tobacco leaf, some in the gas phase and others in the particulate matter. The gas phase represents approximately 60% of the smoke from the burning of tobacco; 99% of this phase is composed of nitrogen, oxygen, carbon dioxide, carbon monoxide, hydrogen, argon, and methane. Atherosclerosis associated with smoking is not necessarily an effect of nicotine, but probably of the joint action of the various constituents of cigarette smoke. Reactive oxygen species (ROS) from the gas phase of tobacco smoke contribute to the onset and progression of atherosclerosis. Bupropion and varenicline are used for smoking cessation despite their side effects; however, we are still far from an effective treatment to assist the definitive discontinuation of the habit of smoking. This review discusses the main mechanisms associated with vascular damage from smoking.