Since its invention in the 1980s, the polymerase chain reaction (PCR) has been further developed and now serves as the foundation for the various PCR-based techniques used in molecular diagnosis across different species and numerous sample types. Real-Time PCR enables the user to monitor the amplification of a deoxyribonucleic acid (DNA) or complementary DNA (cDNA) target during the PCR run, in real time, rather than only at the end, as is the case in conventional PCR. The most frequent applications include gene expression analysis, gene silencing, variant analysis, and melting temperature analysis. Given its vast field of application, a key question concerns the controls (negative controls, positive controls, and internal exogenous and endogenous controls) and their purpose in a Real-Time PCR experiment. In this paper, we set out to establish how and when to use them, and which types of controls are suitable for particular experiment types, since the use of appropriate controls during Real-Time PCR experiments reduces the effects of variables other than the independent variable within the sample, thereby yielding accurate results for both research and diagnostic purposes.
Ventricular septal defects (VSDs) are the most common type of heart malformation and may occur as part of a syndrome or in isolated form. Clinical manifestations are related to the interventricular flow, which is determined by the size of the defect. Identifying genetic causes is important in both syndromic and non-syndromic forms of VSD, in order to estimate prognosis and choose optimal management. Other reasons for identifying genetic factors in the etiopathogenesis include the assessment of the risk of neurodevelopmental delay, recurrence in the offspring, and association with extracardiac malformations. The diagnostic process has improved, and choosing the most suitable and accessible technique in clinical practice currently represents a challenge. Next-generation sequencing has brought additional advantages to genetic testing, with various testing panels now available in many laboratories.
Chemical peeling is a dermato-cosmetic procedure used to destroy and remove, in a controlled manner and under specialist supervision, the degraded layers of the skin, in order to accelerate the skin regeneration process. Based on their depth of skin penetration, chemical peels are classified as superficial, medium-depth, and deep. The substances used in chemical peels differ from one another according to their effective depth of action. A peel agent of appropriate depth should be selected based on the problem to be treated, also taking into account the nature of the skin pathology. To achieve the best results, other factors, such as skin type and characteristics, the region to be treated, safety issues, healing time, and patient adherence, should also be considered. The present review focuses on the particularities of the substances used in various peel types, highlighting recent advances in chemical peel technology and explaining the suggested application of certain substances in different peel types.
Patient positioning is a crucial step in neurosurgical interventions. It is the shared responsibility of the neurosurgeon and the anesthesiologist.
Patient safety, the surgeon’s comfort, choosing an optimal trajectory to the lesion, reducing brain tension by facilitating venous drainage, using gravity to keep the lesion exposed, and dynamic retraction represent the general rules for correct positioning. All bony prominences must be protected with silicone padding. The head can be positioned using a horseshoe headrest or a three-pin skull clamp, following the general principles: avoid elevating the head more than 30 degrees above the heart, avoid turning the head more than 30 degrees to one side, and maintain two to three fingerbreadths between the chin and the sternum. Serious complications can occur if the patient is not properly positioned, which is why great care must be taken during this step of the surgical act.
Cardiovascular autonomic neuropathy is the most frequent clinical form of autonomic diabetic neuropathy and arises secondary to the involvement of cardiac autonomic fibres, which are actively involved in cardiac rhythm regulation. Patients with type 2 diabetes mellitus can present cardiac autonomic neuropathy early in the disease. Autonomic nerve function in diabetes mellitus patients should therefore be assessed as early as the diagnosis is established, in order to define the optimal therapeutic strategy. The most frequently used cardiovagal test is heart rate variability. Abnormal heart rate variability in the presence of orthostatic arterial hypotension indicates severe cardiac autonomic neuropathy. The development of cardiac autonomic neuropathy depends on glycaemic control, disease duration, and associated risk factors. Glycaemic control is extremely important, especially early in the disease: poor glycaemic control carries unfavourable long-term effects despite subsequent optimal control, a phenomenon named “hyperglycaemic memory”. In type 2 diabetes mellitus patients, the association of cardiac autonomic neuropathy with intensive glycaemic control increases the mortality rate, because, secondary to autonomic impairment, these patients do not present the typical symptoms of hypoglycaemia. Stratifying cardiac autonomic neuropathy aids the clinician in assessing the morbidity and mortality risk of diabetes mellitus patients, as it is an independent risk factor for mortality, associated with silent myocardial infarction and the risk of sudden death.
Spleen-derived immune cells are considered to play a central role in the progression of ischemic brain damage, contributing to both the local and the systemic inflammatory response initiated by an ischemic insult to the brain tissue. Brain-spleen communication in acute ischemic brain injury has been studied mainly in rodent models of stroke, which mimic acute focal brain ischemia in humans. Rodent spleens decrease in size after experimentally induced stroke, mainly due to the release of splenic immune cells into the circulation. Splenectomy prior to middle cerebral artery occlusion protects the ischemic brain, resulting in decreased infarct volume and reduced neuroinflammation. Various therapeutic strategies in clinical use that aim to protect the neural tissue after stroke were found to involve modulation of splenic activity, altogether indicating that the spleen may be a potential therapeutic target in ischemic brain injury. Importantly, most clinical studies have demonstrated that the splenic response in stroke patients is similar to the changes seen in rodent models. Thus, despite the limitations of extrapolating the results of animal experiments to humans, rodent models of stroke represent an important tool for studying and understanding brain-spleen communication in the pathogenesis of acute brain ischemia.
Over the past years, prevention and control of risk factors have begun to play an important role in the management of patients prone to develop atrial fibrillation (AF). A considerable number of risk factors that contribute to the creation of a predisposing substrate for AF have been identified over the years. Although certain AF risk factors, such as age, gender, genetic predisposition, and race, are unmodifiable, controlling modifiable risk factors may represent an invaluable tool in the management of AF patients. In recent decades, numerous studies have evaluated the mechanisms linking different risk factors to AF, but the exact degree of atrial remodeling induced by each factor remains unknown. Elucidating these mechanisms is essential for initiating personalized therapies in patients prone to develop AF. The present review aims to provide an overview of the most relevant modifiable risk factors involved in AF occurrence, with a focus on the mechanisms by which these factors lead to AF initiation and perpetuation.
Heart failure still represents a real challenge in both everyday practice and research, due to the complex issues related to its pathogenesis and management. Humoral biomarkers have emerged in recent decades as useful tools in the diagnosis and risk stratification of heart failure and in guiding its treatment. These molecules are related to different pathological and adaptive processes, such as myocardial injury, neurohormonal activation, and cardiac remodeling, their most widespread representatives being the natriuretic peptides (e.g. NT-proBNP). The role of altered gene expression and transcription as the basis of myocardial structural and functional changes in heart failure is widely recognized. MicroRNAs (miRNAs) are non-coding RNAs that play a major role in post-transcriptional gene expression by interfering with messenger RNA molecules. Our short review summarizes the molecular biology of miRNAs and their possible role as biomarkers in the diagnosis and prognosis of heart failure. Furthermore, the therapeutic perspectives offered by these molecules are also presented.
Keywords: miRNA, biomarkers, heart failure
Quality by Design (QbD) is a systematic approach to development that begins with predefined objectives. The approach places emphasis on the product development process and on process control, built on risk management and comprehensive scientific knowledge. The concept of QbD applied to analytical method development is now known as Analytical Quality by Design (AQbD). The main principles of AQbD are the definition of the Analytical Target Profile (ATP) and the risk assessment of the variables that can affect the performance of the developed analytical method. AQbD permits movement of the analytical method within the method operable design region (MODR). This paper discusses the various views of analytical scientists, a comparison with conventional method development, and the phases of AQbD-based analytical techniques.
The article highlights the fact that public health is an element of the security dimension that must be included on the priority agenda of specialists in the fields of international relations and security studies. Several arguments support this thesis. The costs of materialized threats to human security in general, and to public health in particular, are particularly high, with serious long-term consequences. Global trends, and the prospective implications they can generate, are likely to change the world’s security landscape, and increasing global connectivity raises the degree of uncertainty about public health implications. Non-traditional issues arising from technological change can induce risks whose management may exceed institutional capacities. On the other hand, the new types of wars, increasingly interconnected with various forms of risk materialization, make this mission more difficult. The final conclusion is that these risks need to be assessed to ensure national, regional, and even global security, and that international cooperation is needed to prevent and counter them.