AUTHOREA

Preprints

Explore 66,105 preprints on the Authorea Preprint Repository

A preprint on Authorea can be a complete scientific manuscript submitted to a journal, an essay, a whitepaper, or a blog post. Preprints on Authorea can contain datasets, code, figures, interactive visualizations and computational notebooks.

The associated risk factors for the diagnosis of asthma in the preschool age children...
Fatih Kaplan, Erdem Topal, and 1 more

February 18, 2022
Background: Chronic nonspecific cough is one of the important reasons for doctor visits in childhood. The aim of this study was to determine the incidence of asthma diagnosis in preschool children followed up with a diagnosis of chronic nonspecific cough, and the factors associated with the diagnosis of asthma. Methods: The files of preschool children followed up with a diagnosis of chronic nonspecific cough were retrospectively reviewed. Demographic data, treatments given for chronic nonspecific cough, cough scores during follow-up, and similar information were recorded from the medical files. Patients who were diagnosed with asthma during follow-up were included in Group 1, and the other patients in Group 2. Results: A total of 226 patients were enrolled in the study. The mean age of the patients was 4 years (min: 1, max: 6), and 102 (45.1%) were female. The median follow-up time was 30 months (min: 30, max: 34). No significant difference was found between the two groups in terms of age, sex, or follow-up time. During follow-up, 137 patients (60.6%) were diagnosed with asthma. The factors associated with an asthma diagnosis during follow-up were parental asthma, sibling asthma, aeroallergen sensitivity, elevated total serum IgE, eosinophilia, and low income level. Conclusions: More than half of preschool-age children with chronic nonspecific cough receive a diagnosis of asthma. Therefore, patients with risk factors such as parental asthma, sibling asthma, aeroallergen sensitivity, elevated total serum IgE, eosinophilia, and low income level should be followed up for asthma.
Alterations of intestinal flora and the effects of probiotics in children with recurre...
Ying Zhu, Weizhuo Wang, and 2 more

February 17, 2022
Recurrent respiratory tract infections (RRTIs) are one of the major health problems in children. Different degrees of immune deficiency or immaturity, as well as specific or non-specific immune dysfunction, are the main causes of RRTI in children. Immunomodulators are the most commonly used treatment. Probiotics are viable bacteria that colonize the intestine and affect the host's intestinal microbial balance, and they can participate in local and systemic immune regulation in a variety of ways. The aim of this review was to assess the specific impact of probiotic use on recurrent respiratory tract infections in children, and the ability of live bacteria to alter intestinal microbial populations and exert subsequent benefits for the host.
Case Report: Severe brain hypoxic damage after acute methanol poisoning
Ameni Abidi, Asma Souid, and 6 more

February 17, 2022
Methanol poisoning is a challenging clinical situation with irreversible neurologic complications, mainly encountered in developed countries. We report the case of a 50-year-old patient who presented with methanol poisoning manifesting as respiratory and neurologic failure. In this context, cerebral magnetic resonance imaging revealed entangled injury mechanisms leading to the neurologic failure.
Exploring the causes of flow attenuation at a beaver dam sequence.
Hugh Graham, Alan Puttock, and 4 more

February 17, 2022
Beavers influence hydrology by constructing woody dams. Using a before-after control-impact experimental design, we quantified the effects of a beaver dam sequence on the flow regime of a stream in SW England. Building upon our previous research (Puttock et al., 2021), we consider the mechanisms that underpin flow attenuation in beaver wetlands. Rainfall-driven hydrological events were extracted between 2009 and 2020 for the impacted (n=612) and control (n=634) catchments, capturing events seven years before and three years after beaver occupancy at the impacted site. Generalised additive models were used to describe average hydrograph geometry across all events. After beaver occupancy, lag times increased by 55.9% in the impacted catchment and declined by 17.5% in the control catchment. Flow duration curve analysis showed a larger reduction in the frequency of high flows following beaver dam construction, with declines in Q5 exceedance levels of 33% and 15% for the impact and control catchments, respectively. Using event total rainfall to predict peak flow, five generalised linear models were fitted to test the hypothesis that beaver dams attenuate flow to a greater degree with larger storm magnitude. The best-performing model showed we can have high confidence that beaver dams attenuated peak flows, with increasing magnitude, up to between 0.5-2.5 m³ s⁻¹ for the 94th percentile of event total rainfall; but we cannot confidently detect attenuation beyond the 97th percentile. Increasing flow attenuation with event magnitude is attributed to transient floodplain storage in low gradient/profile floodplain valleys. These findings support the assertion that beaver dams restore attenuated flows. However, with long-term datasets of extreme hydrological events lacking, it is challenging to predict the effect of beaver dams during extreme events with high precision. Beaver dams will have spatially variable impacts on hydrological processes, requiring further investigation to quantify responses to dams across differing landscapes and scales.
The fate of the Dead Sea: total disappearance or a hypersaline lagoon
Ibrahim Oroud

March 28, 2023
An energy balance model run on a monthly time step for 800 years was developed to predict the future level and areal extent of the Dead Sea under different scenarios of freshwater input and boundary conditions. The model integrates energy, water and salt balances. The bathymetry of the Dead Sea was obtained from digital elevation data derived from a high-resolution contour map. Model results were verified against measured lake levels for the period 1928 through 2022. Predicted levels are very close to observed values, as demonstrated by three statistical measures. The monthly temperatures of the mixed layer as predicted by the model were also commensurate with observational results and with atmospherically corrected surface temperatures retrieved from the thermal band onboard Landsat 8. Simulation results show that the Dead Sea level will range from ~510 m below sea level (m bsl) when freshwater input is ~400 × 10⁶ m³ a⁻¹ to 585 m bsl when precipitation is the only freshwater source reaching the lake. The time span needed for the Dead Sea to reach a quasi-steady-state equilibrium (that is, when consecutive simulated annual cycles are identical) is several hundred years.
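For orientation, the monthly water balance that such an energy-balance lake model integrates can be sketched as

\frac{\Delta V}{\Delta t} = Q_{\mathrm{in}} + \left( P - E \right) A(h),

where V is lake volume, Q_in is freshwater inflow, P is precipitation, E is evaporation (supplied by the energy balance, which for the Dead Sea also depends on salinity), and A(h) is the level-dependent surface area from the bathymetry. The notation here is assumed for illustration and is not taken from the preprint.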
Glacier hydrological process modeling based on improved SWAT+: A case study in the Up...
Chengde Yang, Min Xu, and 4 more

February 17, 2022
Glaciers have proven to be a particularly sensitive indicator of climate change, and the impact of glacier melting on downstream water supplies is becoming increasingly important as the world's population expands and global warming continues. Data scarcity in mountainous catchments, however, has been a substantial impediment to hydrological process simulation. Therefore, an integrated glacier hydrological process module was introduced into the Soil and Water Assessment Tool Plus model (SWAT+), in which an enhanced temperature-index glacier melt algorithm considering solar radiation was employed to maintain model clarity and favorable performance. Furthermore, SWATplusR was introduced for sensitivity analysis using the Sobol approach, and the Integrated Parameter Estimation and Uncertainty Analysis Tool Plus (IPEAT+) was coupled with the enhanced model (SWAT+Glacier) to perform calibration and validation in the Upper Yarkant River (UYR) basin. The results indicated that (i) including glacial-hydrological processes considerably improved simulation precision, with an NSE 2.6 times and an R² 1.7 times greater than for the original model; (ii) simulating glacial-hydrological processes with SWAT+Glacier and calibrating it against observed discharge data is an efficient and feasible approach in data-scarce, glacier-melt-dominated catchments; and (iii) glacier runoff is concentrated in the summer season, accounting for about 78.5% of annual glacier runoff, and glacier meltwater provides approximately 52.5% (4.4 × 10⁹ m³) of total runoff in the study area.
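As context, enhanced temperature-index melt schemes of the kind described add a solar-radiation term to the classic degree-day formula; one common formulation (the notation is assumed for illustration, not taken from the preprint) is

M = \begin{cases} (MF + r\, I)\, T, & T > T_c \\ 0, & T \le T_c \end{cases}

where M is melt, MF is a melt factor, r is a radiation coefficient, I is incoming shortwave radiation, T is air temperature, and T_c is a threshold temperature.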
Recent patents in allergy and immunology: A qRT-PCR method for diagnosing asthma and...
An-Soo Jang, Pureun-Haneul Lee, and 3 more

February 17, 2022
Recent patents in allergy and immunology: A qRT-PCR method for diagnosing asthma and asthma exacerbation. Pureun-Haneul Lee, SeonMuk Choi, MinHyeok An, An-Soo Jang. Division of Allergy and Respiratory Medicine, Department of Internal Medicine, Soonchunhyang University Bucheon Hospital, 170 Jomaru-ro, Wonmi-gu, Bucheon, 14584, Korea.
Comparison of asthma control before and during the SARS-CoV-2 pandemic in children 3-...
Shah Mahrukh, Mohammed Alsabri, and 7 more

February 17, 2022
OBJECTIVE: To determine asthma control during the SARS-CoV-2 pandemic in a minority pediatric population at a community hospital. BACKGROUND: During the pandemic, exposure to allergens and infectious agents decreased due to heightened hygiene measures and primarily virtual visits. We examined the effect of the pandemic on pediatric asthma control. DESIGN/METHODS: Our study included 104 asthmatic children, 3-18 years of age. The mean age of patients was 9.7 ± 3.8 years. Subjects were assessed during the periods March-August 2019 and March-August 2020. Outcome variables included: rescue albuterol and systemic steroid use, physician visits (PMDv), emergency department visits (EDv), hospitalizations (H) for asthma exacerbation, pulmonology clinic visits, change in and adherence to controller therapy, spacer technique, and BMI. RESULTS: During the pandemic, the majority of pulmonology clinic visits were via telemedicine. There was a significant difference in appropriate spacer technique and change in controller regimen, with improved technique and a decreased requirement for step-up controller therapy during the pandemic. There was no significant difference in BMI, adherence to controller therapy, or the number of pulmonology visits. Additionally, there was an improvement in asthma control during the pandemic, with less use of rescue albuterol and systemic steroids and fewer H, EDv and PMDv for acute exacerbation. CONCLUSIONS: Overall, we found that asthma control improved during the SARS-CoV-2 pandemic, in terms of reduced albuterol and systemic steroid use and decreased asthma exacerbations. The recent addition of telemedicine to patient care has not negatively affected asthma control in children.
Development of a novel high resolution melting assay for identification and different...
Simone Scherrer, Sophie Peterhans, and 6 more

February 17, 2022
Actinobacillus pleuropneumoniae is the etiological agent of porcine pleuropneumonia, a respiratory infectious disease responsible for global economic losses in the pig industry. From a monitoring perspective, as well as because of the different courses of disease associated with the various serovars, it is essential to distinguish them between herds or countries. In this study, we developed a novel high resolution melting (HRM) assay based on reference strains for each of the 19 known serovars and 15 additional clinical A. pleuropneumoniae isolates. The novel HRM comprises the species-specific APP-HRM1 and two serovar-specific HRM assays (APP-HRM2 and APP-HRM3). APP-HRM1 allowed PCR amplification of apxIV, resulting in an A. pleuropneumoniae-specific melting curve, while nadV-specific primers differentiated biovar 2 from biovar 1 isolates. Using APP-HRM2 and APP-HRM3, 13 A. pleuropneumoniae serovars can be determined by inspecting the assigned melting temperature. In contrast, serovars 3 and 14, serovars 9 and 11, and serovars 5 and 15 have partly overlapping melting temperatures and are thus challenging to distinguish accurately. Consequently, to unambiguously ensure correct assignment of the serovar, it is recommended to perform the serotyping HRM assay with a positive control for each serovar. This rapid and user-friendly assay showed high sensitivity, with 1.25 fg to 125 pg of input DNA, and a specificity of 100% for identifying A. pleuropneumoniae. Characteristic melting patterns of amplicons might allow the detection of new serovars. The novel HRM assay has the potential to be implemented in diagnostic laboratories for better surveillance of this pathogen.
“Mothers know best”: Female primates promote proximity among their offsprings’ patern...
Marie Charpentier, Clémence Poirotte, and 7 more

February 17, 2022
Kin discrimination is a key process structuring social relationships in animals. We show how it may be generalized to entail discrimination towards non-kin, and provide a first example of this process in a primate. In mandrills, we recently demonstrated increased facial resemblance among paternally related females, indicating adaptive opportunities for paternal kin recognition. Here, we argue that mothers use their offspring's facial resemblance with other infants to guide their offspring's social opportunities. Using deep learning for face recognition combined with long-term field observations, we demonstrate that mothers are spatially closer to infants that more strongly resemble their own offspring, facilitating associations among similar-looking infants. Using theoretical modeling, we describe a plausible evolutionary process whereby mothers gain fitness benefits by promoting nepotism among paternally related infants. This novel mechanism, which we call "second-order kin selection", may extend beyond mother-infant interactions and has the potential to explain cooperative behaviors among non-kin in social species, including humans.
Simultaneous Determination of 2- and 3-MCPD Esters in Infant Formula Milk Powder and...
Huan Yao, Siqi Zeng, and 4 more

February 17, 2022
Esters of 2- and 3-monochloropropane-1,2-diol (MCPD) are important contaminants of processed edible oils used as foods or food ingredients. The aim of this study was to develop and validate a new GC-MS method for the simultaneous quantification of 2-MCPD and 3-MCPD esters in infant formula milk powder and edible vegetable oils. In the developed protocol, the fat fraction of infant milk powder and edible vegetable oil samples was extracted and treated with sodium methylate-methanol to cleave the ester bonds of the 2- and 3-MCPD esters, with deuterium-labeled 2- and 3-MCPD palmitic acid diesters and stearic acid diesters used as internal standards. The method was validated on food products, demonstrating good accuracy (recovery of MCPD esters ranging from 86% to 114%), high sensitivity (LODs of 3-MCPD and 2-MCPD esters of 0.025 and 0.020 mg/kg, and LOQs of 0.075 and 0.060 mg/kg, respectively) and satisfactory repeatability (RSD below 6.8%) for all analytes. From 150 commercial edible vegetable oil and infant formula milk powder samples, we obtained a preliminary profile of MCPD ester contamination.
Comprehensive Folding Variations for Protein Folding
Jiaan Yang, Wen Xiang, and 10 more

February 16, 2022
The revelation of protein folding is a challenging subject in both discovery and description. Beyond acquiring an accurate 3D structure for a protein's stable state, another big hurdle is discovering the structural flexibility inherent to proteins. Even if a huge number of flexible conformations are known, the difficulty is how to describe these conformations. A novel approach, the protein structure fingerprint, has been developed to expose comprehensive local folding variations and then construct folding conformations for the entire protein. The backbone of 5 amino acid residues was identified as a universal folden, and a set of Protein Folding Shape Codes (PFSC) was derived to completely cover the folding space in an alphabetic description. A database was then created to collect all possible folding shapes of local folding variations for all permutations of 5 amino acids. Subsequently, the Protein Folding Variation Matrix (PFVM) assembles all possible local folding variations along the sequence of a protein, which possesses several prominent features. First, it shows fluctuation with certain folding patterns along the sequence, revealing how protein folding is related to the order of amino acids in the sequence. Second, all folding variations for an entire protein can be apprehended simultaneously at a glance within the PFVM. Third, all conformations can be determined from local folding variations in the PFVM, so the total number of conformations is no longer ambiguous for any protein. Finally, the most probable folding conformation and its 3D structure can be acquired from the PFVM for protein structure prediction. The protein structure fingerprint approach therefore provides a significant means for investigating the protein folding problem.
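For scale: with 20 standard amino acids, covering all permutations of 5 residues implies 20^5 = 3,200,000 possible five-residue fragments in the database, and a protein of length N contributes N - 4 overlapping five-residue windows to its PFVM (this arithmetic is inferred from the abstract, not stated in it).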
Diagnosis of fetal congenitally unguarded tricuspid valve orifice by echocardiography
Hailan Liu, Gaole Yuan, and 5 more

February 16, 2022
[Abstract] Objective: To review the imaging characteristics and evaluate the diagnostic value of echocardiography for fetal congenitally unguarded tricuspid valve orifice (CUTVO). Methods: Doppler echocardiography was performed in ten fetuses with CUTVO and the images were compared with operative and necropsy findings. The aim of the study was to summarize the characteristics of fetal echocardiography and analyze the causes of missed diagnoses and misdiagnoses. Results: There were six cases with complete absence and four cases with partial absence of the tricuspid leaflets. In seven of ten cases the pregnancy was terminated. In six cases CUTVO was confirmed by autopsy after induced labor, while one case had no autopsy. After birth, one case died due to severe illness. The two remaining cases survived with an atrial septal defect and patent ductus arteriosus on postpartum ultrasound scans; these cases underwent surgical treatment, resulting in less than moderate tricuspid regurgitation. Among all cases, four were misdiagnosed and the diagnosis of CUTVO was missed, but CUTVO was demonstrated after induced labor. The ultrasonographic characteristics of CUTVO consist of an atrioventricular connection with normal arteries and a partially or completely absent tricuspid valve apparatus. The annulus of the tricuspid valve can be described as "empty" in the apical 4-chamber view, and Doppler evaluation shows to-and-fro flow across the tricuspid orifice with low velocity and a two-way spectrum. Conclusion: Diagnosis and differential diagnosis of CUTVO by fetal echocardiography has important clinical value.
Microbial approaches for the assessment of toothpaste efficacy against oral species -...
Pune Paqué, Lamprini Karygianni, and 7 more

February 16, 2022
Antibacterial properties of toothpastes enable chemical plaque control in limited-access tooth regions that are not sufficiently reached mechanically by toothbrushes. Therefore, this study aimed to compare different microbial methods for assessing antimicrobial toothpaste properties, and to evaluate different toothpastes in terms of their antibacterial efficacy against different oral microorganisms in vitro. Six toothpaste suspensions with varying antibacterial supplements were applied to a multi-species biofilm model (Actinomyces oris, Candida albicans, Fusobacterium nucleatum, Streptococcus oralis, Streptococcus mutans) as well as to each microorganism individually. A culture method was used to assess the anti-biofilm effects, and two different agar diffusion assays were performed to test the antimicrobial effect on each microorganism. The measurements of the culture and diffusion analyses were statistically normalized and compared, and toothpastes were ranked according to their antimicrobial efficacy. The results of both agar diffusion assays showed a high correlation across all tested species (Spearman correlation coefficients ρs > 0.95). The results of the multi-species biofilm model, however, differed substantially in their assessment of antibacterial properties (ρs ranging from 0.22 to 0.87) compared to the results of both diffusion assays. Toothpastes with amine fluoride, with and without stannous fluoride, and with triclosan showed the highest antimicrobial efficacy, while activated carbon supplements were comparable to the negative control (NaCl). The appropriate selection of a broad range of oral microorganisms seems crucial when testing the chemical impact of toothpastes and toothpaste supplements.
Cross clinical-experimental-computational qualification of in silico drug trials on h...
Cristian Trovato, Marcel Mohr, and 4 more

February 16, 2022
Background and Purpose: Preclinical identification and understanding of drug-induced cardiotoxicity remains a major challenge. The ICH S7B Q&As promote human in silico drug trials for proarrhythmia risk assessment. However, additional evidence is needed to support further regulatory impact and the integration of such trials into current preclinical assessment pipelines. This study aims to provide a comparative evaluation of drug-induced electrophysiological effects on in silico and in vitro cardiac Purkinje preparations, and to assess the accuracy of these models for clinical risk prediction. Experimental Approach: The effects of 14 reference compounds were quantified in a population of in silico human cardiac Purkinje models and compared with results obtained in in vitro rabbit Purkinje preparations. For each drug dose, five electrophysiological biomarkers were quantified at three pacing frequencies, and the results were compared with clinical proarrhythmia reports. Key Results: i) In silico, repolarisation abnormalities in human Purkinje simulations predicted drug-induced arrhythmia for all risky compounds, showing higher predictive accuracy than rabbit experiments; ii) drug-induced electrophysiological changes observed in human-based simulations showed a high degree of consistency with in vitro rabbit recordings at all pacing frequencies, with depolarisation velocity and action potential duration the most consistent biomarkers; iii) discrepancies observed for dofetilide, sotalol and terfenadine are mainly caused by species differences between humans and rabbits. Conclusion and Implications: In this study we showed the high degree of consistency and the higher accuracy of in silico methods compared to in vitro animal models, demonstrating the high regulatory impact of in silico trials for proarrhythmia prediction.
Features of a dream programming language: 2nd draft
Magne Matre Gåsland

February 17, 2022
Since I improved the original article quite a bit, I saved the original for posterity: to respect the contents of the original URL (where it may have been shared), and to respect the ensuing Hacker News discussion based on the original article. The original article received some encouraging feedback from Ruby's creator Matz, and positive feedback and excellent critique from Roc's creator Richard Feldman.

The article is a long read, so I suggest first skimming it and dipping into the sections where you find interest or disagreement. Then, if your curiosity is piqued by the ideas, I hope you will take time to read it carefully over a few evenings, mull over the ideas presented (a few are novel), consider your own dream language features, and contribute those, as well as your own insights and experience, in the comments below.

TL;DR / SUMMARY:

I long for a very constrained language for web-app + systems development, prioritizing readability (left-right and top-down) and reason-ability above all, which is designed for fast onboarding of complete beginners (as opposed to catering to a specific language community who already have the curse of expertise).

The most important familiar features it should have:
- Functional Programming, but based on data-first piping.
- Immutability, but w/ opportunistic in-place mutation.
- Gradually typed: dynamic for development, static for production (w/ full & sound type inference).
- CSP for concurrency (goroutines).
- Ecosystem: interoperable with existing languages.
- Transpiles to JS or compiles to WASM.

The most important esoteric features it should have:
- Crucial evolvability / backward- and forward-compatibility.
- Content-addressable code.
- Transparent upgrades without any breaking changes.
- Data-First Functional Programming w/ SVO-syntax.
- Interpreted, for development. But compiled, incrementally, for production.
- Interactive: facilitates an IDE plugin (VS Code) that shows the contents of data structures while coding. Enables REPL'ing into a live system.
- Aggressively parallelizable. Inspired by Verilog and Golang.
- Scales transparently from single CPU to multi-core to distributed services, without needing refactors.

The complete set of desirable features are detailed below.

FIRST, A FEW OVERARCHING GUIDING PRINCIPLES:

- ALL PROGRAMMING LANGUAGES ARE, IN ACTUALITY, BUILT TO OVERCOME _HUMAN_ LIMITATIONS. Otherwise, one might as well be typing 0's and 1's, or a lower-level language like Assembly.
- Software architecture in general (and frameworks in particular) is _a way to organize the mind_ of the developer(s), categorising the conceptual world into what's closely or merely remotely related (giving rise to principles like 'cohesion over coupling' etc.). (This might explain OOP's popularity.) The machine would be perfectly content with executing even spaghetti code. Inspired by Martin Fowler.
- A programming language has certain affordances, allowing you to talk specifically about/with some concepts (typically the first-class citizens of the language), and avoid having to talk about other things (e.g. memory management, language runtime concerns). This does not only apply to DSLs.
- "Each programming language has a tendency to create a certain mind set in its programmers. ... you tend to have a mental model of how to do things based on that language. ... Such a mind set may make it difficult to conceive of solutions outside of the model defined by the language." - Dennis J. Frailey
- Only by accounting for human limitations (like cognitive capacity, and familiarity), could one derive a specification for the ideal programming language.
- A BUG IS AN ERROR IN THINKING. Either by the developer, or by the language designer for not sufficiently accounting for human psychology (Sapir-Whorf: the language you write/speak determines what you can/do think). Even Dijkstra himself equates programming with thinking. Programming does not simply _require_ thinking, it _structures_ thinking.
- To reduce bugs, a language should ensure SIMPLE, SAFE, AND SCALABLE _WAYS OF THINKING_ (a sketch of a few of these follows after this list of principles). For instance:
  - Type systems are a way to use the compiler to help us verify our beliefs about our own code: they help us think consistently.
  - Closures enable the programmer to specify and share behaviors that are already half-way thought through (i.e. already set up with some external data).
  - Transducers allow the programmer to define and compose behaviors/processes without having to think explicitly about the particular thing which is behaving.
  - Currying allows the programmer to take one grand behavior and break it down into smaller behaviors that can be reused independently, or used in sequence.
  - Composition allows the programmer to think & build piece by piece, instead of all at once, and without the context influencing too much.
  - "Open for extension, closed for modification": a programmer can recognize something useful, and add more pieces to it, without having to change the original thing (e.g. extension methods in C#), and without tying those new pieces too closely to the original thing (e.g. subclassing), thereby limiting their reuse.
- A language determines WHAT you can & have to think about, but also HOW you have to think about it.
- "THINGS THAT ARE DIFFERENT SHOULD LOOK DIFFERENT". Counter-inspired by D. Inspired by Larry Wall on Perl's postmodernism and my own frustrations with modern component frameworks like React, and my impression that Lisp/Clojure is perceived as hard to learn because it has so little syntax: when everything looks the same it is hard to tell things apart. SYNTAX MATTERS, so when people balk at noise like too many parentheses, you ought to listen (like React did), rather than ignore it (like Clojure did and still does; even though the Lisp inventor McCarthy didn't even like the S-expression syntax himself), in order to remove friction from onboarding and growth. It can be the barrier between mass adoption and remaining niche. A language should optimize for mass adoption, due to network effects, since then _everyone wins_: learning, communication, portability, ecosystem etc. It doesn't mean pleasing everyone's ultimate desires, just avoiding having most people turn at the door. Counter-inspired by Lisp/Clojure, and Haskell. Although it is prize-worthy to STAY VERY FRUGAL WITH SYNTAX, since more syntax necessitates more learning/documentation (knowledge debt, info overload), more avenues for confusion (the best code is no code), and more complications (language intricacies can lead to software intricacies, which can lead to bugs). My philosophy leans more towards Golang (less features, readability is reliability, simplicity scales best) and Python ("explicit over implicit", "one way over multiple ways"), than Ruby (provide sharp knives) and Perl (postmodern plurality, coolness/easiness is justification enough in itself, aka. the smell of a toy language). Even though I come from Ruby and love it, and also cannot help admiring Lisp for its elegance and crucial evolvability.
- Programming should be fun and painless. Inspired by Ruby.
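(Before moving on: a minimal TypeScript sketch of three of the 'ways of thinking' above, namely closures, currying, and composition. TypeScript merely stands in for the dream language, and every name here is invented for illustration.)

// A closure: a behavior "half-way thought through", already set up with external data.
const makeGreeter = (greeting: string) => (name: string) => `${greeting}, ${name}!`;
const greet = makeGreeter("Hello"); // the greeting is baked in
console.log(greet("Ada")); // "Hello, Ada!"

// Currying: one grand behavior broken into smaller, independently reusable steps.
const add = (a: number) => (b: number) => a + b;
const increment = add(1); // a reusable partial behavior

// Composition: build piece by piece instead of all at once.
const compose = <A, B, C>(f: (b: B) => C, g: (a: A) => B) => (x: A): C => f(g(x));
const double = (n: number) => n * 2;
const incThenDouble = compose(double, increment);
console.log(incThenDouble(3)); // double(increment(3)) = 8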
PURPOSE: WHAT SHOULD THIS DREAM LANGUAGE OF MINE PRIMARILY BE FOR?

- Webapp / app + systems development. In Rich Hickey's words: information-driven situated programs. Ideally, also open to extension into more areas of programming.
- Scripting and prototyping, but also scalable to production use (app/webapp).
- Systems development (compiled).

I believe there is enough cross-over between web app (in-browser + web server) and systems development that a Turing-complete language could address both successfully. The language Go / Golang tries to do that (for web servers and systems), for instance. Though the question is up for debate whether or not it is a good idea, or if we should always have specialized languages for those domains.

So, from various sources of inspiration, and with the aforementioned principles in mind, here is the list of features that my dream programming language would have.

FEATURES:

Features in BOLD are considered most important.

- READABILITY AND REASONABILITY as top priority. Reduce dev mind cycles > reduce CPU cycles. Human-oriented and DX-oriented. Willing to sacrifice some performance, but not much, and not to overly chase comparability with natural language (counter-inspired by SQL; inspired by Cypher). Willing to sacrifice immediate power in the language itself, esp. if that can be achieved through abstracted-away libraries.
- Should always be able to be read top-to-bottom, left-to-right. No <expression> if <condition> like Ruby allows. Certainly no <code_block> while <condition>.
- Syntax matters: readability should not imply a one-to-one match with natural language (counter-inspired by SQL), since natural language is inconsistent, duplicitous, ambivalent and multi-faceted. Consistency is key to a programming language. But it should borrow some similarities from natural language (like its popular Subject-Verb-Object, SVO, structure; see also DFFP) to make adoption easier (more at-hand/intuitive). The SVO syntax also aligns elegantly with the Graph Data model of RDF (subject-predicate-object triples), and if code-is-data (homoiconicity or pseudo-homoiconicity is preserved) it could be interesting to have the code map well to a graph database, opening up avenues for analysis in the form of advanced graph algorithms, which could be useful for, say, code complexity analysis (e.g. more straightforward cyclomatic complexity analysis). Homoiconicity (code structure mirroring a data structure) could potentially also help with respect to typing, since we want to be able to execute the same code at compile-time (statics) and run-time (dynamics), to avoid the biformity and inconsistency of static languages: _"the ideal linguistic abstraction is both static and dynamic; however, it is still a single concept and not two logically similar concepts but with different interfaces."_ To avoid duplication. This is also inspired by NexusJS and io-ts: _"The advantage of using io-ts to define the runtime type is that we can validate the type at runtime, and we can also extract the corresponding static type, so we don't have to define it twice."_
- ISOLATION / ENCAPSULATION. To analyze a program (i.e. break it down and understand it in detail), you need to be able to understand parts of the program in isolation. So everything should be able to be encapsulated (all code, whether on back-end or front-end), since ENCAPSULATION AFFORDS REASONABILITY (and testability), by limiting the places bugs (i.e. errors in thinking) can hide. Counter-inspired by Rails views (sharing a global scope) and instance variables. Inspired by the testability of pure functions.
- No need to manipulate data structures in the human mind. The programmer should always be able to _SEE_ THE DATA STRUCTURE he/she is working on, at any given time, in the code. Inspired by Bret Victor, and Smalltalk. Ideally with example data, not only the data type. Also, it should be possible to visualise/animate an algorithm. Since "An algorithm has to be seen to be believed", as D. E. Knuth said. It shouldn't be necessary for the programmer to take the effort to visualize it in his mind (with the error-proneness that entails). So the language should make such visualization and code-augmentation easy for tooling to support. But without being a whole isolated universe in its own right, like a VM or an isolated image. Counter-inspired by Smalltalk. Some have described this as REPL-driven development, or interactive programming. Especially good for debugging: getting the exact program state from loading up an image of it that someone sent you. Inspired by Clojure. But with the ability to see the content of the data structures within your IDE. Inspired by QuokkaJS. The REPL-driven development approach should ideally afford simply changing code in the code editor, detecting the change, and showing the result then and there, without you having to go back-and-forth to a separate REPL shell and copy-pasting / retyping code. Inspired by ClojureScript. In fact, since a program is about binding values to the symbols in your code, when running your code the IDE (enabled by the content-addressable code feature) could replace variables in the text with their bound values, successively. Effectively animating the flow of data through your code. Without you having to go to an external context like a debug window to see the bindings.
- Params: function parameters must be named, but no need to repeat yourself if the argument is named the same as the parameter (i.e. keyword arguments can be omitted); see the sketch below. Inspired by JS object params, and Ruby. Counter-inspired by the Mysterious Tuple Problem in Lisp, and inspired by the labeled arguments in ReasonML, and the interleaving of keywords and arguments in Smalltalk. If currying, then input params should be explicit at every step (for clarity, refactorability and to aid the compiler). Counter-inspired by point-free style in FP (since "explicit is better than implicit", inspired by Python). Should probably not auto-curry functions, since "it makes default, optional, and keyword parameters difficult to implement".
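(A minimal TypeScript sketch of the named-parameters idea above; the object-literal shorthand plays the role of omittable keyword arguments, and all names are invented.)

// Parameters are named via a single object; the type documents the call site.
function drawRect({ width, height, color }: { width: number; height: number; color: string }) {
  console.log(`drawing ${color} rect ${width}x${height}`);
}

const width = 100;
const height = 50;

// Shorthand: no need to repeat yourself when the argument variable
// has the same name as the parameter.
drawRect({ width, height, color: "red" });

// Order-independence falls out for free: fields are named, not positional.
drawRect({ color: "blue", height: 20, width: 40 });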
- No Place-Oriented Programming (PLOP); in other words, avoid order-dependence at almost any cost, since it isn't adaptable/scalable. Inspired by Clojure. This goes for reorderability of expressions, due to pure functions having no side-effects. Such reordering is desired (see: "4.2 Complexity caused by Control", Out of the Tar Pit, 2006) since it allows structuring programs in a finish-to-start/high-to-low-level manner, enabling the reader to incrementally drill down into the code with the underlying implementation (the same reason that JS has function hoisting). Order-independence also goes for parameter lists to functions. I don't want to have to use a _ placeholder for places where there _could_ be a parameter, just because I didn't supply one. Shouldn't have to sacrifice piping just to get named arguments, either (piping should use an explicit pipe operator). Counter-inspired by Elm, and inspired by Hack. Consequence (?): would need a data structure like a record, but which ideally can be accessed in an order-independent manner (similar to a map). Plus, functions should be able to take in such records but also a rest parameter that can represent an arbitrary number of extra fields (to make functions more reusable and less coupled to their initial context, e.g. they should be able to be moved up/down in a component hierarchy without major changes to their parameter lists). Counter-inspired by Elm.
- No unless or other counter-intuitive-prone operators. Counter-inspired by Ruby.
- No abstract mathematical jargon. Counter-inspired by Haskell. As it impedes onboarding, and induces a mindset of theorizing and premature abstraction/generalization that impedes rapid development. Should be accessible to as wide a community as possible, with as little foreknowledge as possible. Inspired by Quorum.
- Do not presume contextual knowledge. In UX this is known as "No modes!". Code should be able to be read A to B without having been educated/preloaded with any foreknowledge (like 'in this context, you have these things implicitly available'). Counter-inspired by class inheritance and Ruby magic, and JavaScript's runtime-bound this keyword and associated scoping problems. Turns out too much dynamism (runtime contextualisation) can be harmful. Counter-inspired by JavaScript: _"In JS, this is one truly awful part. this is a dynamically scoped variable that takes on values depending on how the current function was invoked."_ -- sinelaw.
- Should facilitate and nudge programming in the language towards code with a low Cognitive Complexity score.
- DYNAMIC VERBOSENESS: should be able to show more/less of syntax terms in the code (think: length of variable names). Beginners will want more self-documenting code. Whereas experts typically desire terser code, so they can focus on their problem domain without clutter from the language (e.g. mathematics). A programmer will typically gradually go from beginner to expert on a codebase, even his own. See: content-addressable code. Content-addressable code would afford dynamic verboseness, which is important because: _"A language should be designed in terms of an abstract syntax and it should have perhaps, several forms of concrete syntax: one which is easy to write and maybe quite abbreviated; another which is good to look at and maybe quite fancy... and another, which is easy to make computers manipulate... all should be based on the same abstract syntax... the abstract syntax is what the theoreticians will use and one or more of the concrete syntaxes is what the practitioners will use."_ -- John McCarthy, creator of Lisp.
- DYNAMIC CODE-ARRANGEMENT: code should be able to be rearranged in order of start-to-finish/low-to-high-level or finish-to-start/high-to-low-level, because each is beneficial at various points when writing/reading code. Today, code is fixed in how it is written. Typically in either an imperative way, displaying steps chronologically from start to finish, akin to chaining, e.g. data.func1.func2.func3, or in a functional way: from outside-in, e.g. func3(func2(func1(data))), where, when reading to understand the order of execution, you have to read from the innermost to the outermost expression. This is problematic at various times, and has arguably made functional programming less accessible to newcomers. But both ways of reading are desirable, at different times. At one time you just care about viewing the overall result/conclusion (e.g. func3 and what it returns), and potentially working your way backwards/inwards a little bit. Maybe you start with an end goal in mind (the name of a function), and then drill down to a more and more concrete implementation. But at another time you care about going the other way around: seeing how it is executed from start to the finishing result/conclusion (think: piping). This duality of thinking/reading mirrors how we approach reading in other domains. This feature could be enabled by content-addressable code, since the arrangement of code itself could be made more malleable and dynamic. See: content-addressable code. (A sketch of the two reading orders follows below.)
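(A small TypeScript sketch of the two reading orders just described. The pipe helper is a hypothetical stand-in for the explicit pipe operator the article asks for; all function names are invented.)

const trim = (s: string) => s.trim();
const words = (s: string) => s.split(/\s+/);
const count = (xs: string[]) => xs.length;

// Outside-in (functional): execution order reads innermost-to-outermost.
const n1 = count(words(trim(" to be or not ")));

// Start-to-finish (data-first piping): reads top-to-bottom, left-to-right.
interface Pipe<T> {
  to<U>(f: (x: T) => U): Pipe<U>;
  value(): T;
}
function pipe<T>(x: T): Pipe<T> {
  return { to: (f) => pipe(f(x)), value: () => x };
}
const n2 = pipe(" to be or not ").to(trim).to(words).to(count).value();

console.log(n1, n2); // 4 4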
- Not indentation-based (counter-inspired by Python), since it is brittle: copy/paste bugs when the source and destination are indented differently. But also not requiring semicolons. Inspired by Ruby, and semicolon-free JS. Might consider indentation-based if the language has a standard style and formatter.
- Fast feedback to the programmer is the second top priority. Inspired by TypeScript hints, QuokkaJS (!), Webpack Hot Reload, and Expo Live Reload.
- REPL / interactive shell. Can be done even if compiled, by having an interpreter on top of a VM to the compiler.
- REFACTORABILITY / CHANGE-ABILITY.
  - Similar-looking and non-interacting code lines should be able to change place without breaking anything. Counter-inspired by not being able to add a comma to the last line in JSON, not being able to reorder/extract from comma-separated multi-line variable declarations in JS, and also counter-inspired by the contextualised expression terminators in Erlang.
  - Consistent syntax, optimized for code refactoring: "The language syntax facilitates code reuse by making it easy to move code from local block → local function → global function". Inspired by Jai.
- BACKWARD- AND FORWARD-COMPATIBLE. Should be able to not worry about (or make poor future tradeoffs due to) backward-compatibility. (Counter-inspired by ECMAScript.) To make the language optimally and freely evolvable, and worriless to upgrade. Backward-compatibility and forward-compatibility: code in one language version should be transformable (in a legible way) to another version (both ways; backward and forward). In the case of lossy changes, the old version should be stored (so it is revertible), and in the case of "gainy" changes the compiler should notify the programmer where in the code it is now missing explicit information (based on identifying locations in the code where the old language constructs are used). There should also be solutions to either: have simple CLI tools to AUTOMATICALLY REFACTOR OLD CODE TO NEW LANGUAGE VERSIONS, to always stay optimally ADAPTABLE, without having breaking changes. Maybe using some form of BUILT-IN SELF-TO-SELF TRANSPILATION. Will likely need to be able to treat code-as-data. Might need compile-time macros. Or a solution could be to: with every breaking language revision, include an incremental language adapter, which would allow upgrading whilst ensuring backward compatibility. Could be solved with Mechanical Source Transformation, enabled by gofmt, so developers can use gofix to automatically rewrite programs that use old APIs to use newer ones. Which is crucial in MANAGING BREAKING CHANGES. A breaking (aka. widely deviating) change should, in effect, not actually break anything (that current languages and systems do break things is considered a "pretty costly" design flaw).
- Content-addressable code: names of functions are simply a uniquely identifiable hash of their contents. The name (and the type) is only materialized in a single place, and stored alongside the AST in the codebase. Avoids renaming leading to breaking third parties, and avoids defensively supporting and deprecating several versions of functions. Avoids codebase-wide text manipulation, eliminates builds and dependency conflicts, robustly supports dynamic code deployment. Code would also need to be stored immutably and append-only for this to work. All inspired by Unison. (A toy sketch follows below.)
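(To make the content-addressing idea concrete: a toy TypeScript sketch using Node's built-in crypto module. A real system like Unison hashes a normalized AST rather than raw source text, so treat this purely as an illustration; all names are invented.)

import { createHash } from "node:crypto";

// Toy "codebase": maps a content hash to a definition and its human-readable name.
// The name is mere metadata; the hash is the true identifier callers reference.
const codebase = new Map<string, { name: string; source: string }>();

function define(name: string, source: string): string {
  // A real system would hash a normalized AST, so that formatting and
  // local variable names do not change the identity of the definition.
  const hash = createHash("sha256").update(source).digest("hex").slice(0, 12);
  codebase.set(hash, { name, source });
  return hash;
}

const incId = define("increment", "(n) => n + 1");
console.log(incId); // callers reference this hash, not the name "increment"

// Renaming is a metadata update; no caller referencing the hash breaks.
codebase.get(incId)!.name = "addOne";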
- Configuration, but with sane conventional defaults. (Not relying solely on convention over configuration (CoC), due to potentially too much implicitness/magic. Counter-inspired by Ruby on Rails.) "If you apply [CoC] dogmatically you end up with an awful lot of convention that you have to keep in your head. It's always a question of balance; Hard Coding vs. Configuring vs. Convention, and it's not easy to hit the optimum (which depends on the circumstances)." as Peer Reynders reminded us.
- The language should be "open to extension" by anyone in the community, without permission. So that it can evolve and converge to a consensus, based on real-world experience and feedback. This is mirrored in the important talk Growing a Language, by Guy Steele, and the point on crucial evolvability. But the culture of the language community should _not_ encourage bending the language in unintended ways just for the sake of it, as staying close to the overarching general-purpose language (GPL) makes knowledge transfer/usage generally applicable across domains (being able to move between projects within the same language should _in general_ be made easy), and affords a more cohesive ecosystem. Counter-inspired by DSLs (see: avoid DSLs).
- Modularity. A module system which is sensible. Counter-inspired by the NodeJS controversy. Code-splittable and tree-shakeable. Inspired by Rollup. Function-level dead-code elimination, inspired by Elm. This is possible in Elm because functions there cannot be redefined or removed at runtime. This could potentially conflict with the envisioned Hot Upgrade feature inspired by Clojure (see: Interactive). This problem could perhaps be removed by disallowing modification/overloading of functions and data types in the standard library (or any 3rd-party library). Alternatively: it should be possible to specify which parts of the application should be tied down and optimized (the client code), and which parts should, at the potential expense of larger assets, be Hot Upgradeable (the server code).
- The standard library should even be tree-shakeable. Inspired by Zig.
- QUICK TO GET STARTED AND PRODUCE SOMETHING. Inspired by JS. Counter-inspired by JS tooling.
- Not too unfamiliar (to a large group of programmers, and to what they teach in universities). "Familiarity and a smooth upgrade path is a really big deal." (source)
- Sensible, friendly, and directly helpful error messages. Inspired by Elm.
- DATA-FIRST FUNCTIONAL PROGRAMMING (DFFP). Mimics the style of object-orientation, but is simply structs and functions under the hood. (Also: functional programming patterns over procedural code.) Because it is human to see and visualize the world (as well as computing) in terms of objects and verbs, and to use verbs to signify the causal relations between objects. Focusing too heavily on only one of the paradigms (OOP or FP) can typically lead to anti-patterns (God classes/objects, Factory objects, and Singletons, in OOP), or program structures far removed from the business domain model which also have linguistically unintuitive syntax, as in FP (cf. FP is not popular because it is backwards). This is since only 12% of natural languages start with the verb, either Verb-Subject-Object (VSO) or Verb-Object-Subject (VOS) (ref), as FP tends to do. So Subject-Verb-Object (SVO) should be preferred. 88% of natural languages start with something concrete, the Subject/Object (and the Object of one sentence is typically the Subject of the next sentence; similar to call chains). I think the SVO alignment is one significant but under-appreciated reason for OOP's success. That, together with enabling a stepwise/imperative construction of programs, makes for a more intuitive approach for beginners, which is vitally important for onboarding & growth. "Objects and methods" could be merely syntax sugar for structs and functions (see: interchangeability of method-style and procedure-call-style, or the pipe-first operator in ReScript, which also illustrates _emulating object-oriented programming_), if one leaves out troublesome inheritance (which might be good, since composition > inheritance). Inspired by Golang. The quote "Data, not behavior, is the more crucial part of programming." is attributed to Linus Torvalds and Fred Brooks. If data is the focus point, the language should mirror that. Interestingly, in addition to the more intuitive API, data-first also affords better IDE integration, simpler compiler errors, and more accurate type inference. Inspired by ReScript.
- Functional programming patterns like .map, .filter, over procedural code like for-loops etc., since the latter would encourage mutating state, and we want immutability.
- Tree-shakeable code (esp. useful for client-server webapps). So it should need a source-code dependency between the calling code and the called function. Which makes the language more FP than OOP, according to one definition of FP vs. OOP. In general, shifting concerns from runtime to compile-time is considered a good thing, as it makes the language more predictable, optimizable, and affords helpful coding tools. Having consequences of code changes appear at runtime is a bad thing (see: the Fragile Base Class problem of OOP).
- Referentially transparent expressions. Which means variables cannot be reassigned, so a name will always refer to the same value (see principle: "Things that are different should look different"). Inspired by Haskell. This feature should enable a high degree of modularization, but could also lead to easy automatic parallelization and memoization.
- Formally verifiable / provable. Nice-to-have, not must-have.
- Automatic TCO (tail-call optimization). To keep the processing lean, and avoid potential stack overflow, by avoiding allocating new stack frames for recursive function calls. Counter-inspired by Clojure / JVM. Inspired by Scheme, the ML languages, and Lua. TCO should align well with the desire to have a language where the programmer can have and mutate a mental model to carry forward, without having to rely on remembering and returning to previously remembered values. (A sketch follows below.)
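(A TypeScript illustration of the shape automatic TCO relies on. Note: most JavaScript engines do not actually perform TCO, so this shows the tail-call form rather than engine-guaranteed behavior; names invented.)

// Tail-recursive factorial: the recursive call is the last operation,
// carrying the running result in an accumulator. A language with automatic
// TCO can reuse the current stack frame for such calls, so recursion depth
// no longer risks stack overflow.
function factorial(n: number, acc: number = 1): number {
  if (n <= 1) return acc; // base case: the accumulator holds the answer
  return factorial(n - 1, n * acc); // tail position: nothing left to do after the call
}

// Non-tail version for contrast: the multiply happens *after* the call
// returns, so the frame must be kept alive and TCO cannot apply.
function factorialNaive(n: number): number {
  return n <= 1 ? 1 : n * factorialNaive(n - 1);
}

console.log(factorial(10), factorialNaive(10)); // 3628800 3628800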
- AGGRESSIVELY PARALLELIZABLE: parallelization made natural. Aided by pure functions. The language should nudge programmers and make it easy/natural to use parallelism, through language constructs (like executing several sequential lines simultaneously). To avoid common _overly_ sequential thinking, which can lead to suboptimal performance (due to not parallelizing work). But humans think sequentially. So we ought to pay heed to Dijkstra's wisdom that, since our intellectual powers to visualize processes evolving in time are relatively poorly developed, we should shorten the conceptual gap between the static program and the dynamic process, by spreading out the process in text space. In simpler terms: enabling the programmer to _trace the flow of execution_ by simply reading the code. Another important reason the language should steer the programmer to aggressively utilize parallelization is Amdahl's Law, which states that, when parallelizing, the limiting factor will be the serialized portion of the program. Notably the queueing delay, due to contention over shared resources, like CPU time. So whatever part of the program is not parallelized will eventually, under high enough load, turn into a bottleneck. The language construct nudging developers to parallelization could be inspired by Verilog's fork/join construct, or the very similar Nurseries, which are an alternative to go statements (since go statements don't afford local reasoning, automatic error propagation and reliable resource cleanup, though some of this may be achieved with a WaitGroup). But as opposed to the fork/join example, the language should enforce a deterministic order upon joining, which should simply be guaranteed by the sequential top-down order of the lines in the fork/join code block (a novel idea, to my knowledge, which would need to be experimented with thoroughly... more thoughts in this issue). A sketch of such a deterministic fork/join follows below. NB: need to research whether, on today's hardware, automatic parallelization could in fact be a potential pessimization in practice instead of an optimization, as Richard Feldman pointed out. In any case, we do not envision making every function call parallelized, but making simple, contained constructs (like nurseries, fork/join) that the programmer can use to signify separable pieces of the problem/algorithm. The runtime should then parallelize those portions aggressively.
  - Alternatively: take inspiration from Chapel, by providing core primitives to control parallelization directly. But preferably through declarative means, and not through such imperative control structures as the for-loops Chapel uses.
  - Alternatively: take inspiration from Golang's elimination of the sync/async distinction and allow programming everything in a sequential manner, but do a degree of parallelism under the hood (so concurrency, in practice). The sync/async barrier elimination, however, doesn't necessarily nudge programmers towards using parallelization (spinning off new threads) within the context of a program (thread). That style might conflict, or it might be synergistic, with the goal of nudging programmers towards making more use of parallelization.
- SYNTAX-ENABLED PARALLELIZATION. Inspired by Verilog and Chapel. Ideally, the language runtime should be able to use parallelization to handle multiple independent processes (like client/server requests; goroutines for concurrency), but also _automatically distribute a single program across multiple CPU cores (facilitated implicitly by the language constructs/structure, without special imperative directives like thread/go)_ when those cores are idle. To do that, the language should not attempt to automatically make _specifically sequential_ code parallel, since such automatic parallelization requires complex program analysis based on parameters not available at compile-time. Instead, the language should nudge towards constructs that afford natural use of multi-threading instead of single-threading (cf. the principle that a language should afford _scalable modes of thinking_). But without compromising readability/reasonability, which is the top priority. The programmer should be concerned with, and simply DESCRIBE, INDEPENDENT SETS OF CAUSAL/LOGICAL CONNECTIONS, and the language runtime should automatically take care of as much parallelization as possible/needed. Inspired by Haskell.
- Safe parallelization. Inspired by Haskell.
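(A rough TypeScript approximation of the deterministic fork/join idea: Promise.all runs the forked tasks concurrently, yet always yields their results in the top-down source order in which they were written, regardless of completion order. Task names are invented.)

const delay = (ms: number) => new Promise<void>((res) => setTimeout(res, ms));

async function fetchUser() { await delay(30); return "user"; }
async function fetchPosts() { await delay(10); return "posts"; }
async function fetchAds() { await delay(20); return "ads"; }

async function main() {
  // "fork": all three start concurrently.
  // "join": results arrive in source order (user, posts, ads),
  // even though fetchPosts finishes first, i.e. a deterministic join.
  const [user, posts, ads] = await Promise.all([fetchUser(), fetchPosts(), fetchAds()]);
  console.log(user, posts, ads); // user posts ads
}
main();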
- Compiled, but also interpreted and/or incrementally compiled (for dev mode). Inspired by Dart.
  - Interpreted / incrementally compiled: so the developer can write quick scripts and get fast feedback. Sacrifices runtime speed for compile speed. Except it also needs quick startup/load time.
  - Compiled: for production. Sacrifices compile speed for runtime speed. Compiles to a binary. Inspired by Deno.
- Small core language: compiled down to a small instruction set, which can be used/targeted as a starting point to generate code for other programming languages (i.e. generate JS).
- Portability: be able to target and run on multiple computer architectures.
- Easy to build from source. Inspired by Zig and Golang.
- MUTABLE API, and OPPORTUNISTIC IN-PLACE MUTATION, but data structures are automatically made IMMUTABLE WHEN SHARED. Inspired by Roc. Immutable/persistent data structures (like Lean-HAMT) and structural sharing, to allow incremental update while also avoiding duplication of data. Inspired by Clojure.
  - In-place mutation, where data structures only become immutable when they're shared (presumes keeping track of borrowing / reference counting). Inspired by Rust, Roc, and Clojure's transients. Immutability gives the benefit of facilitating concurrency and avoiding race conditions. As a bonus you could get time series and thus time travel for data.
  - Mutable API: the desirability of a mutable API (mutating objects instead of always having to pass in functions) is inspired by the JS libraries Immer and Valtio. But for algorithms, instead of using the mutable API in an imperative style, it should allow keeping to a functional style, possibly with something akin to Clojure's transients. Alternatively: a mutable context (block scope) could be mandated for mutations (similar to Immer), which could also afford resource cleanup (if we want to avoid having a GC). Inspired by Rust.
  - Deep immutability: cloning/copying a data structure should not simply copy references below the first level, nor behave differently if the data structure contains certain data structures. Because that is unintuitive/unexpected: a copy should be a full copy (at least as far as the programmer is concerned; it can use structural sharing under the hood). Counter-inspired by JS/TS.
  - Alternatively: instead of using immutability to defeat Shared Mutable State, restrict the split-brain duplicity of keeping a reference to some data while also sharing a reference to it, like Rust does: by simply disallowing local references to data after it has been shared (aka. "moving" data). (See the sketch below.)
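(A toy TypeScript sketch of 'mutable until shared', using Object.freeze at the sharing boundary as a stand-in for the compiler-enforced rule envisioned here. Real implementations, like Roc's, use reference counting instead; all names are invented.)

type Point = { x: number; y: number };

function buildPoint(): Readonly<Point> {
  // Locally owned: cheap in-place mutation is fine, since nothing else can see it yet.
  const p: Point = { x: 0, y: 0 };
  p.x = 10;
  p.y = 20;
  // At the sharing boundary the value becomes immutable.
  return Object.freeze(p);
}

const shared = buildPoint();
// shared.x = 99; // rejected at compile time via Readonly<>; a runtime error in strict mode
console.log(shared); // { x: 10, y: 20 }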
Alternatively: Instead of using Immutability to defeat Shared Mutable State, restrict the split-brain duplicity of keeping a reference to some data while also sharing a reference to it, like Rust does: by simply disallowing local references to data after it has been shared (aka. "moving" data).
- VERY CONSTRAINED. Since _"constraints liberate, liberties constrain"_, as Bjarnason said. Inspired by Golang's minimalism, and Elm's guardrails. For learnability and maintainability. Since discipline doesn't scale (obligatory xkcd: with too much power, and the wrong nudges, all it takes is a moment of laziness/crunch-time to corrupt a strong foundation), and a complex language affords nerd-sniping kinds of puzzles, bikeshedding, and idiomatic analysis-paralysis. Counter-inspired by Haskell. The virtue of functional programming is that it subtracts features that are too powerful / footguns (compared to OOP), namely: mutation & side-effects. The language designers should take care of and standardize all the idiomacy (natural modes of expression in the language). _"Inside every big ugly language there is a small beautiful language trying to come out."_ -- sinelaw. The language should assume the developer is an _inexperienced, lazy, (immediately) forgetful, and habitual_ creature. As long as software development is done by mere humans. This assumption sets the bar (the worst case), and is a good principle for DX, as well as UX. The constrained nature of the language should allow for quick learning and proficiency. Complexity should lie in the system and domain, not the language. When the language restricts what _can_ be done, it's easier to understand what _was_ done (a smaller space of possibilities reduces ambiguity and increases predictability, which gives speed for everyone, at a small initial learning cost). The language should avoid Pit of Despair programming, and leave the programmer in the Pit of Success: where its rules encourage you to write correct code in the first place. Inspired by Eric Lippert, but also by Rust.
- Names: No _alias names_ for keywords in the language or for functions in the standard library (except as documentation references to other languages). Inspired by Python ("explicit over implicit", "one way over multiple ways"). Counter-inspired by Perl (postmodern plurality) and _aliasing_ in the Ramda library. _All things tend toward disorder; as programmers it is our job to Fight Entropy._ The language should favor one consistent vocabulary, since that increases predictability and reduces variability. Even at the cost of expressiveness (the language should afford just enough expressiveness for the domain, see: configurable language, community grown). Names should not mimic any other programming language per se, but attempt to cater to complete beginners, because notation has a large impact on novices, a principle inspired by Quorum. There should be a VS Code plugin that allows people coming from various languages to type function names as they know them, and the editor will translate on the fly. E.g. typing array.filter gets turned into array.keep in the code.
- Guardrails: "_<insert your favorite programming paradigm here> works extremely well if used correctly_", as Willy Schott said. The ideal programming language should both WORK EXTREMELY WELL EVEN WHEN USED INCORRECTLY (which all powerful tools will be), but first and foremost BE EXTREMELY HARD TO USE INCORRECTLY. Inspired by Rust and Elm.
- Not overly terse. Counter-inspired by C.
Maybe give compiler warnings if the programmer writes names with fewer than about 4 characters. READING >>> WRITING, since time spent reading is well over 10x time spent writing (inspired by Robert C. Martin), and writing can be alleviated with auto-complete, text macro expansions, and snippets in the IDE.
- No runtime reflection. Counter-inspired by meta-programming and runtime type inspection in Ruby.
- Not overly verbose. Counter-inspired by XML and Java. Maybe compiler warnings if the programmer writes names with more than about 20 characters.
- [The Rule of Least Power](https://www.w3.org/2001/tag/doc/leastPower.html) (by W3C) suggests a language should be the least powerful language still suited for its purpose. To minimise its complexity and surface area. For better reuse, but more importantly: to make programs, data, and (I will include) _data flows_ easier to analyse and predict. Inspired by FSM & XState. It needs, however, to be just powerful enough to be generally useful (and not limited to a DSL). Possibly Turing-complete. Given these considerations, a Lisp-style language comes to mind. But there are reasons Lisp never became hugely popular. My guess: readability. So while it could be a Lisp language (or compile to one), it should read better than one.
- _Removing variability in the syntax makes it more targetable for tooling and static analysis._ This further benefits the ecosystem.
- It should be small, but extensible by using simple primitives (see: community grown, configurable language). Pragmatically, it should use LLVM to compile to binary. Inspired by Roc. The language should probably be built using OCaml, Rust, Racket or maybe Haskell (LLVM has bindings to these languages). Should do more with less. Inspired by Lisp. Since predictability is good for humans reading, and for machines interpreting; and if it's predictable to machines, humans also benefit. IMPORTANT: _"As one adds features to a language, it ramps up the complexity of the interpreter. The complexity of an analyzer rises in tandem."_ -- Matt Might, on static analysis.
- CODE-FORMATTER, like gofmt, inspired by Golang. A tool to auto-format code into a standard. Since standardisation creates readability and faster onboarding of new developers. It also enables mechanical source transformation, which is crucial for language evolvability.
- Self-hosting: In the future, the language should _maybe_ be made self-hosting, meaning its compiler would be built in its own language. For portability and independence. But it's more important that the language is built initially (using LLVM) to facilitate the targeted use cases (webapp + systems dev.), rather than being implicitly optimized for writing a compiler. Also: building a compiler in the language could potentially mean dealing with so many low-level concerns that the restricted and high-level nature of the language would be compromised. But then again, the language should ideally be suitable for systems development...
- Containability and explicitness. Inspired by PURE FUNCTIONS. Perhaps the language should even restrict a function's scope to only what's sent in through its parameters. So no one can reference hidden inputs (i.e. side-causes). Thus enforcing more predictable functions, where it is always apparent, wherever the function is used, what it takes in and what it returns. So a way to achieve partial application of functions (i.e. useful closures) without addressing the outer scope implicitly could be to supply constants from the outer scope as default/preset/front-loaded parameters, as the sketch below illustrates.
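A minimal sketch of that idea in Rust (`make_adder` is a hypothetical example): the preset value is passed in explicitly and is visible in the signature, rather than captured implicitly from a surrounding scope:

```rust
// Partial application via explicit, front-loaded parameters instead of
// implicit closure capture: `preset` appears in the signature, not as a
// hidden input from an outer scope.
fn make_adder(preset: i64) -> impl Fn(i64) -> i64 {
    move |x| x + preset // `preset` was passed in explicitly
}

fn main() {
    let add_tax = make_adder(25); // the caller supplies (or overrides) the preset
    println!("{}", add_tax(100)); // 125
}
```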
Since "explicit is better than implicit" (inspired by Python's principles), such presets would be declared in the function signature, so you don't have to dive into the function to discover/investigate them. With the added benefit that the function could be customized by the caller through overriding the defaults.
- Memoization, automatic, but measured and only applied dynamically when the runtime finds it beneficial. Aided by pure functions. The programmer shouldn't have to think about memoization when programming, but should be able to tune the degree of memoization (since it is a space/time tradeoff) through general configuration, for advanced cases not optimally served by the default. Runtime optimisations such as these are not critical features, but certainly nice to have, and should be considered in the language design in those cases where they can affect the implementation of the language runtime. Memoization of math might not always be worth it (06:43 @ Andrew Kelley on Data-oriented design), so the adaptive runtime should measure math calculations and decide whether or not to memoize them in main memory (RAM), or just recompute the calculations because the CPU and its cache are so fast that accessing RAM would be slower. These are hardware concerns that are subject to change as hardware progresses, and such concerns should thus not be encoded in the language syntax, but transparently taken care of by the language's runtime environment. See: adaptive runtime.
- Explicit imports, so tree-shaking (to remove unused code) can be done. Inspired by JS. Also, so that it is clear where imported functions come from. Counter-inspired by Golang.
- Pattern matching. Inspired by Elixir, Rust and ReScript. The expression-oriented nature of the language should make this natural, without extra/fancy syntax. Pattern matching could preferably replace if/else conditional logic, perform totality checking to ensure you've covered every possible condition, and even enable conditional branching based on type (see the Result sketch further below).
- Not file-boundary dependent. Can be split into files, but execution shouldn't be dependent on file boundaries. So the programmer is free to keep code tightly together. Inspired by SolidJS.
- Niceties. Inspired by Bagel.
  - "Only single-quotes can be used for strings, and all strings are template strings and can be multiline ('Hello, ${firstName}!')"
  - No triple-equals; double-equals will work the way triple-equals does in JS.
  - Range operator for creating iterators of numbers (5..10, 0..arr.length, etc).
- No magic / hidden control. Control-flow should be easy to trace, because that makes it easy to understand and debug. Less magic. Counter-inspired by Ruby on Rails. Inspired by Elixir Phoenix routing / endpoint plugs. Explicitness makes testing isolated parts of the system possible. So Explicit > Implicit, almost always. Inspired by Python's principle. (Although implicitness is preferred when one may intuitively and robustly determine the convention through the context. E.g. needing self. references to access class variables inside the class methods would just be noise. This is counter-inspired by Python, and inspired by Ruby. However, using self and this is considered an anti-pattern in general.)
- Make Inversion of Control (IoC) hard/impossible (?).
The language should ideally always return control to the programmer, not take it away. To enable the programmer to always follow the control-flow by simply reading and following references. Thus, yield should also be avoided (counter-inspired by Ruby). Problem: this could make domain code dependent on integrations, which goes against the dependency inversion rule. So other patterns, like containing integration coupling in an intermediary abstraction ('port/adapter' function or library, or 'channels'), would need to be developed.
- Libraries over frameworks, as a strongly recommended community convention (because frameworks cannot be prevented by a language, afaik). Frameworks utilise inversion of control. That creates stack traces which are really hard to debug, because they reference the framework and not your own code, which is especially problematic with concurrency. And when yielding control to various (micro-)frameworks, compatibility becomes a specific issue. The programmer shouldn't ever have to ask: "Is this library/framework compatible with this other one?" Counter-inspired by JS Fatigue. Or have to ask: "Where is the execution path of this program?" Counter-inspired by the magic of Ruby on Rails. When the control is always returned to the programmer (no IoC), he/she may mix and match more freely, without up-front worrying about compatibility (leading to analysis paralysis).
- Meta-programming: No first-class macros (runtime), since they are too powerful a footgun. But should have compile-time macros. Inspired by Clojure. So that the language can be extended by the community, and so that legacy code could be updated to the latest language version by processing the code with macros to transform the syntax.
- EXPRESSIONS OVER STATEMENTS. The calling code should always get something back (Is. 55:11). Because the returned object can be further chained. Inspired by Clojure and Haskell. Counter-inspired by JavaScript. Statements suck, as even the inventor of JavaScript, Brendan Eich, admits. A goal should be to eliminate the subjective/anthropocentric bias that afflicts programming (especially the imperative kind), because: _it is not you, the programmer, who is, or should be, calling code; code should be calling code (and not terminating in the void, as if it were you, the programmer, acting on the machine)._
- Abstractions which are powerful, made from simple primitives. _Maybe_ homoiconicity... since it would make writing the compiler easier, and make the language more readily able to evolve in the community on its own (permissionless). Inspired by Lisp and Clojure's Rich Hickey.
  - But this would allow meta-programming, and the associated complexity..?
  - The language should maybe also not be so powerful that programs become entirely composed of very high-level domain-specific abstractions, since that encourages esotericity and sociolects, but most importantly: code indirection when reading/browsing. Coding should not feel like designing an AST, so the language should encourage keeping the code flattened (by piping, perhaps?) and as down-to-earth as possible. Could maybe be alleviated by an IDE plugin which would allow temporary automatic code inlining (editable previews).
- REVERSIBLE DEBUGGING / TIME-TRAVEL DEBUGGING (TTD). "Reverse debugging is the ability of a debugger to stop after a failure in a program has been observed and go back into the history of the execution to uncover the reason for the failure." -- Jakob Engblom. Inspired by Elm.
Re: Accounting for human limitations and affording the most natural way of thinking: _"The problem you are trying to fix is at the end of a trail of breadcrumbs in the program's execution history. You know the endpoint but you need to find where the beginning is, so working backward is the logical approach."_ (source). Should at least have this. Could be enabled by, but does not necessarily need:
- Reversible / invertible control flow: "A reversible programming language produces code that can be stopped at any point, reversed to any point and executed again. Every state change can be undone." (source). Maybe. Might not be feasible, or desirable, when it comes down to it. Might be aided by immutability, and persistent data structures (if they are extended with history-traversal / operation-logging features, in addition to structural sharing).
- TRANSPILER, CONFIGURABLE, so it could translate between all language dialects and variations. So that the language could evolve in multiple directions, and consolidate later, without harm.
  - Homoiconicity could perhaps enable this.
- Eager evaluation, by default. Since it is more straightforward to reason about in most cases, simpler to analyze/monitor, and spreads memory consumption out more in time than lazy evaluation, which piles up work and in the worst case could overflow memory at an unexpected time (in any case, the programmer shouldn't have to worry about evaluation strategies, including space-usage performance and evaluation stack usage). But it should use the more efficient lazy evaluation when currying functions, or chaining methods, unless intermediate error-handling or similar requires value realization (and even here, transducers could potentially alleviate unnecessary value realization); see the iterator sketch after this block. Inspired by Lazy.js. But this is an optimisation that could wait. Concurrent operations across threads/processes should not be lazy: you'd want to start exercising that machine as soon as possible. Counter-inspired by Haskell. Although it must be said: I am eager to be convinced that lazy is better, and that space leakage and the bookkeeping overhead can be minimized. But in general, THE PROGRAMMER SHOULDN'T NEED TO WORRY ABOUT _WHEN_ THE MACHINE EXECUTES SOME PIECE OF CODE. Why wouldn't it be possible for a compiler to figure out at compile-time how and where functions are referenced, and choose eager or lazy evaluation depending on which is more suitable? For sequential chaining of operations on data structures, it could be lazy, and for other operations (potentially further apart in the program, with potentially memory-intensive operations in between) it could choose to be eager (get the work done, so the memory can be freed asap). Or?
- ASYNC: BLOCKING/SYNC INTERFACE, BUT NON-BLOCKING I/O. Inspired by Golang, and to a lesser extent JS / Node.js too. Should not have to litter code with async/await repeatedly (see: What Color is Your Function?, the problem with function annotations, and async everything). Could be solved with Async Transparency, inspired by Hyperscript. But hiding the async nature behind synchronous-seeming abstractions could create a dangerous model-code gap, with a potential impedance mismatch and a cause of design errors and bugs (inspired by Simon Brown)... So the language should make some abstractions around async simple (like goroutines in Golang). But also inspired by declarative and easily statically analysable async contexts, made with JSX, like Suspense (an async if-statement) in React and SolidJS.
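Returning to the eager-by-default, lazy-when-chaining point above: a minimal sketch using Rust iterators, which already behave as this hybrid. Adaptors are lazy and fuse into one pass (the effect transducers aim for), and a consumer triggers eager evaluation:

```rust
fn main() {
    // The map/filter chain does no work on its own: adaptors are lazy and
    // fuse into a single loop with no intermediate collections. Evaluation
    // happens eagerly only when a consumer (sum) pulls the values.
    let total: i64 = (1i64..=1_000_000)
        .map(|x| x * 3)
        .filter(|x| x % 2 == 0)
        .sum();
    println!("{total}");
}
```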
- Alternatively: Async everything? Referential transparency, obtained if the language enforces pure functions (i.e. no side-effects), could potentially open up an avenue for making everything async by default (and letting the compiler insert await instructions where it figures out functions are not I/O-bound and thus can be optimized to direct/synchronous CPU execution instead, without the overhead of asynchronicity).
- EASE OF REASONABILITY IS FIRST PRIORITY, and I believe it is best afforded by SIMPLE AND CLEAR ABSTRACTIONS (without model/code impedance mismatch, as made important by the failures of ORMs and distributed contexts). The choice of a sync interface here, as opposed to async, is similar to how the wish for lazy evaluation by default was discarded for eager evaluation by default. One argument by Ryan Dahl of Node.js is that sync by default with explicit async (he mentions goroutines in Go) is a nicer programming model than async everything (like in Node). Because it's easier to think through what the program is doing in one sequential control flow than by jumping into other function calls, as in Node.js (if you are using async callbacks). See the "fragments your logic" point below. Reasonability is a top priority, so we cannot make a compromise here.
- Async Channels do not block until the sender and receiver are ready, but simply put messages onto the call queue (mailbox) of the receiver (see the "Machines" concept under the "Scalable" feature). Counter-inspired by Golang. So that the sender can continue working without waiting for the receiver (freeing CPU time at the expense of memory).
- Rich Hickey also has some good arguments against async by default (when implemented with callbacks, as in JS), namely that it:
  - fragments your logic (spread out into handlers), instead of keeping it together. The programmer has to deal with multiple contexts at once (complicated), instead of one overarching context (simple).
  - callback handlers perform some action once in the future, but the state they operate on may have mutated in the meantime. So it may give a false confidence in being able to get back to the state as it was when the callback was registered. Want to avoid the dreaded Shared Mutable State. May be solved by only allowing immutable constructs.
- On the other hand, having sync by default, and _async through Channels_:
  - gives the control back immediately (in line with functional composition), instead of functions that effectively evoke side-effects on the real world at the other end (as callback handlers do). In line with our principle: always give control back to the programmer.
  - channels are generalized pieces of code that can handle many connections (pub/sub).
  - channels afford safe concurrency (thread handling), whilst with callback handlers (unless used in an event-loop system such as JS) the programmer has to ensure safe concurrency (which we don't want).
  - channels afford choice of when to handle an event, whereas a callback gets called whenever it gets called (event-loop). Channels work in line with our principle: always give control back to the programmer.
- All of the above have implications for reasonability. Needs to be investigated further... Golang's way of handling async seems to be the current gold standard, touted by many bright people, since _"Golang has eliminated the distinction between synchronous and asynchronous code"_ (by letting the programmer code everything in a sync fashion, but doing async I/O under the hood).
Golang's principle of _"Don't communicate by sharing memory; share memory by communicating."_ avoids the dreaded Shared Mutable State, and lends itself better to ensuring SIMPLE, SAFE, AND SCALABLE MODES OF THINKING (our core principle): It's hard to think about something if it has changed the next time you think about it (thus: immutability). Or if thinking about it changes it (manifesting in code the cognitive equivalent of Heisenbugs): programmers need to be able to reason about a program's state without simultaneously modifying that state (inspired by CQRS).
- Another, but more radical idea: THE PROGRAMMER SHOULDN'T HAVE TO THINK ABOUT _WHEN_ OR _WHERE_ THE CODE WILL RUN. It should be managed by the language runtime, based on the specified platform. If the program is a local program for one machine, then it could be specified to run the work _synchronously_. If run over multiple machines, it could be specified to use async by default, to delegate work. But then if results don't arrive in time (from a remote machine/CPU core), it could choose to perform the work itself, but lazily, when the result is needed. So there should be some built-in semi-lazy evaluation measure based on CPU monitoring. Also, for the work it decides to do itself, the runtime should decide when to perform it: if the CPU cores are idle, then it should eagerly execute the work, but if not, then it should postpone just enough work so that the CPUs are adequately exercised. Currently, in languages without this nuanced model, the programmer has to make an either-or distinction based on a generalized heuristic of whether or not async or lazy makes sense, and apply it in a fixed fashion. But these assumptions do not necessarily hold for all operational scenarios. Ideally, the programmer shouldn't have to think about such operational, low-level matters.
- CONCURRENCY. FOR MULTI-CORE AND DISTRIBUTED. Probably a CSP model, or a similar or novel model, since CSP-style concurrency is easier to debug. Inspired by Golang.
  - Async: Concurrency should integrate well with the async feature of the language. The default should be to easily ship tasks off to be completed elsewhere (another thread/process/worker/server). Inspired by Golang and JS.
  - Probably not implemented as an Actor Model, since Actors' statefulness is complex. Also, events going all over the place in a non-stateful app are harder to reason about than stricter promise-based operations (using callbacks under the hood). Counter-inspired by StimulusJS. Inspired by ReactJS.
  - Concurrency vs. Parallelism should be up to the runtime, not something the programmer should have to worry about. If the runtime has multiple cores, then parallelize the tasks onto those cores. If the runtime only has one core to work with, then interleave the execution of the tasks concurrently on that single core.
- SCALABLE: From single-core to multi-core CPUs, and from one machine to a distributed set of machines. Without needing refactors. Inspired by Alan Kay's vision of computing, and the purpose of the Actor Model, utilized in Erlang/Elixir and Pony's actors (w/ async functions). But rather than state-driven Actors, I'd rather have it implemented with "Machines", which simply give stateless functions a call queue each (a sketch follows below). Inspired by Smalltalk, but stateless. They call each other by sending Messages (containing the parameters) to the other function's call queue, taking into consideration error-handling and the unreliability of distributed computing.
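A minimal sketch of such a "Machine" in Rust, assuming a plain thread plus an mpsc mailbox as a stand-in for the proposed runtime construct (the `Call` type and its fields are hypothetical):

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical message type: the parameters of a call, plus a channel
// on which the Machine sends its reply.
struct Call {
    x: u64,
    reply: mpsc::Sender<u64>,
}

fn main() {
    // The Machine: a stateless function given a call queue (mailbox).
    let (mailbox, calls) = mpsc::channel::<Call>();
    thread::spawn(move || {
        for call in calls {
            let result = call.x * call.x;    // the pure function itself
            let _ = call.reply.send(result); // reply; ignore vanished callers
        }
    });

    // A caller sends a Message and decides for itself whether to block
    // on the reply (sync) or check it later (async).
    let (reply_tx, reply_rx) = mpsc::channel();
    mailbox.send(Call { x: 7, reply: reply_tx }).unwrap();
    println!("machine answered: {}", reply_rx.recv().unwrap());
}
```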
The caller chooses whether their call should be sync/blocking or async/non-blocking, since sync/async (aka. blocking/non-blocking) is not a feature of the called function, but of the call itself. Counter-inspired by JS. We name such functions "Machines". Each of them is in fact a mini-computer, or a computer-within-the-computer, if you will, but without inherent state (which ought to be stored and managed by a DB or in-memory DB). Such Machines should be able to be moved to distributed systems without rewriting the code. Inspired by Alan Kay and Actor Model systems (Akka), and languages where actors are first-class citizens, like Pony. A single Machine could work as a minimal microservice, or better yet, a Cloud Function, but likely you'd want multiple endpoints which expose a Machine each.
- NO GLOBAL VARIABLES: Because global variables don't map to a distributed setting, so scaling up couldn't be done without a rewrite. Instead, a function calls another function by sending a message. See the aforementioned "Machine" concept. Messages can be passed through chains of function calls by piping and/or ...rest parameters.
- No variable shadowing _within_ functions, but the inside of a function may shadow the outside. Inspired by C#. Since variable shadowing is a reason to have keywords such as let, disallowing variable shadowing could afford the opportunity to not have such keywords, for minimalism. But functions should shadow variables external to the function, since this ensures the writer/reader doesn't need to know about potential name collisions with external/global variables.
- Ideally, for performance, when code is compiled to be run on a single machine, the compiler should be able to optimise away the Mailboxes, so that Machines can be turned into (simpler and faster) synchronously executed functions.
- Facilitate and nudge developers towards creating "Functional Core, Imperative Shell" architectures (inspired by Bernhardt at 31:56 in his Boundaries talk), to preserve the purity of functions as far as possible, while also CONTAINING SIDE-EFFECTS:
- CONFIGURABLE LANGUAGE: Platform configs that encapsulate all I/O primitives, which introduces a separation between trusting a particular platform and trusting the language. Inspired by Roc. Could even make certain features of the language only available on certain platforms, e.g. the Browser platform doesn't have access to low-level memory management. So that the language can be as restrictive as possible for the environment, ensuring that code is written idiomatically for the target platform/environment. A restricted language has value because, for a given platform, the programmer would encounter less diversity in the language and thus have less to learn. This strikes a balance between, on one hand, providing sharp knives as global tools programmers can apply anywhere (i.e. potential footguns, leading to The Pit of Despair, like in C++), and on the other hand avoiding being so restrictive that programmers can't talk/write/think about what they want/need to for their given environment. The language itself should be massively configurable: IT IS NOT REASONABLE TO ASSUME THAT THE LANGUAGE DESIGNERS WILL HAVE ACCOUNTED FOR ALL POSSIBLE USE CASES (various memory management strategies etc.). So the language primitives/keywords should be able to be given different underlying effects (e.g. stack vs. heap allocation) based on which platform or use-case is specified (without having to be explicit about every such effect in every environment); see the allocator sketch below.
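As a rough analogue of giving the same primitives different underlying effects per platform, here is a sketch using Rust's pluggable global allocator: swapping the one `ALLOCATOR` declaration changes the memory-management strategy without touching the rest of the program:

```rust
use std::alloc::System;

// One configuration point decides the memory-management strategy for the
// whole program; the code below it is unchanged regardless of the choice.
#[global_allocator]
static ALLOCATOR: System = System;

fn main() {
    let v = vec![1, 2, 3]; // allocates through whichever allocator was chosen
    println!("{:?}", v);
}
```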
Such platform-specific effects should, however, be inconsequential for the reasonability of the code. Meaning that they should be at the bare-metal performance level, not at the level where operators are overloaded to do something different, cf. our principle that _things that are different (i.e. have different effects at the language level the programmer is operating at) should look different_. The platform config will depend on which implementation makes most sense for that platform (or use-case?) (i.e. browser webapp vs. systems development vs. game development, potentially). The language should be configurable by libraries, which will define how it works, and can extend the core to platforms where the programmer needs to think about matters specific to that platform. Inspired by Clojure. THE SAME PROGRAM SPECIFICATION SHOULD BE ABLE TO HAVE DIFFERENT RUNTIME CHARACTERISTICS ON DIFFERENT PLATFORMS, depending on the platform configuration. This could be enabled by the programming language concerning itself with modeling causal relationships, instead of place-oriented programming.
- ENCAPSULATED I/O, so functions can avoid having side-effects. Inspired by Haskell.
  - ALTERNATIVE #1: Algebraic Effects for I/O, so that side-effects can be contained in a given context. Algebraic Effects are also a powerful general concept that could help with concurrency, async/await, generators, backtracking, etc. Inspired by OCaml.
  - ALTERNATIVE #2: Use an IO action of an IO type (inaccurately named "IO Monad" at 30:44 in the Boundaries talk), transparently (without actually having to deal with the concept of a Monad). Where you effectively construct a sequence of I/O operations to be executed later. Inspired by Haskell's separation between pure functional code and code that produces external effects (cf. the "functional core, imperative shell" concept at 31:56 in the Boundaries talk, which was inspired by Haskell). Something like this is needed because the Mailbox is stateful (it is constructive/destructive, like a queue), and I/O messaging would be a side-effect. The Machine/Mailbox is inspired by the Actor Model from Erlang. Ideally, since all I/O is wrapped, it should be possible to turn the execution of IO actions on/off based on some initial config. This could be useful for testing. You could even do a sample run to collect data, which you could snapshot to use as mock data for test runs.
  - ALTERNATIVE #3: Potentially by syntactic rules: a function should _either return a value, or return nothing_ (i.e. be simply a void procedure). And a procedure can never be placed within a function. NB: Could lead to the colored-functions problem.
  - ALTERNATIVE #4: Use Uniqueness Types, which allow mutability and pass-by-value while also preserving the crucial REFERENTIAL TRANSPARENCY (since side-effects are ok in a pure language as long as variables are never used more than once). Inspired by Clean and Idris. Possibly use Simplified Uniqueness Typing, inspired by Morrow.
  - ALTERNATIVE #5: Simply be able to turn side-effects, like external output operations, on/off. If all output operations are done through an IO module in the standard library, it could afford a simple "off" switch, to be able to do testing. That would prevent side-effects from acting on the outside world during testing. The challenge is side-causes (aka. hidden inputs), however. The language could have the IO module require a default/fallback parameter to be set for external input operations: IO.readFromFile(fileName, "Default file content fallback.").
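The common shape across these alternatives is the "functional core, imperative shell" split itself. A minimal sketch, assuming a hypothetical input.txt: all I/O (both side-causes and side-effects) stays in the thin shell, while the core is a pure, trivially testable function of its inputs:

```rust
// Pure core: a function only of its inputs, no hidden side-causes.
fn summarize(lines: &[&str]) -> String {
    format!(
        "{} lines, {} bytes",
        lines.len(),
        lines.iter().map(|l| l.len()).sum::<usize>()
    )
}

// Imperative shell: all I/O lives here, at the program's boundary.
fn main() -> std::io::Result<()> {
    let text = std::fs::read_to_string("input.txt")?; // hypothetical input file
    let lines: Vec<&str> = text.lines().collect();
    println!("{}", summarize(&lines));
    Ok(())
}
```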
Such a fallback would be used during testing (the benefit being that the mocks would already be present). Another problem with side-effects has to do with the ecosystem (especially interop with other languages): if you use a 3rd-party package, how do you know it won't leak data to a 3rd-party server during runtime? This ought to be solved by a sandboxed runtime environment (inspired by Deno), which should automatically log any attempts at I/O access not explicitly made through your own application code (using the language's IO module). Inspired and counter-inspired by Elm.
- Reactive. Inspired by Functional Reactive Programming, Elm, and The Reactive Manifesto. Though the latter is geared at distributed systems, it could also be a model for local computation (cf. the Actor model, and Akka). The programming language should make the features of reactivity and streaming default and implicit, as opposed to preloading and batch processing. (Reactive streaming data: asynchronous non-blocking stream processing with backpressure.)
- No single-threaded event loop that can block the main thread. Counter-inspired by JS.
- Transducers, under the hood, to compose and collate/reduce transformation functions (chains of map, filter etc. turn into a single function, visualised here). Chaining function calls should use language-supported transducers implicitly. The language should not require special compose syntax.
- Gradually typed, as types can add boilerplate, create unnecessary friction, obstruct a programmer's tinkering flow-state, and create noise in the code. Counter-inspired by TypeScript, and inspired by Elm and Jai. As many types as possible should be inferred. Inspired by TypeScript, but even more inspired by OCaml and ReScript.
- No runtime type errors. Inspired by Elm (and Haskell). See 'Error Handling & Nullability'.
- Union Types: Types should be associative/commutative/composable/symmetric (i.e. A|B should equal B|A), inspired by Dotty/Scala3, and the 'Maybe Not' talk by Rich Hickey.
- Types should be enforced statically at program exit boundaries (so external libraries or outgoing I/O are ensured existing typings).
- Structural subtyping (inspired by TypeScript, OCaml), instead of nominal typing (counter-inspired by Java and Haskell). Since it is the closest you'll get to duck-typing within a statically typed language. But it should also have support for nominal types in the few cases where that might be beneficial (i.e. opaque types, not possible with only structural subtyping). Inspired by ReScript, and counter-inspired by TypeScript.
- Strongly typed (checked at compile time), not weakly typed, since implicit type coercion (at runtime) can be unpredictable, and variables that can potentially change their type at runtime are madness. Inspired by TypeScript and ReScript. Counter-inspired by JavaScript. No runtime/ad-hoc polymorphism (aka. dynamic dispatch), so function/operator overloading would not be possible (e.g. + couldn't be used both for summing ints and joining strings, so you'd have to use ++ or similar), but we'd gain the more important ability to fully infer static types for programs, without having to write type annotations, and compiling could get really, really fast. Inspired by OCaml. Counter-inspired by Clojure, Java, Ruby.
- Generics / Type parameters / Parametric polymorphism (see the sketch below). Inspired by ReScript and OCaml. Counter-inspired by how C++ and Java handle generics.
- Type inference, sound and fully decidable and with 100% coverage. Inspired by OCaml and Roc.
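A minimal sketch of parametric polymorphism with inference, in Rust (`first_or` is a hypothetical example): one generic definition, with the type parameter inferred independently at each call site:

```rust
// One generic function; the compiler checks it once against the trait
// bound and infers T per call site (monomorphizing each instance).
fn first_or<T: Clone>(items: &[T], fallback: T) -> T {
    items.first().cloned().unwrap_or(fallback)
}

fn main() {
    println!("{}", first_or(&[10, 20], 0));        // T = i32, inferred
    println!("{}", first_or(&["a", "b"], "none")); // T = &str, inferred
}
```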
Full inference means not having to declare types everywhere. For increased readability and convenience (though not essential, cf. the popularity of Rust). But local type inference inside the body of functions is what's most important (inspired by Scala), since functions' input/output types should always be declared, for documentation purposes. But they could probably be generated after the prototyping phase / exploratory coding is done and you want to ossify the code. In that case, they should not be inline (like in TS), but next to the function definition (like in Elm).
- Pragmatic type bindings for external libraries: should allow you to write type bindings that mirror how you will use the library in your own project, instead of getting stuck at generalizing potentially complex types. Inspired by ReScript.
- Typed Holes / Meta Variables. Inspired by Idris. Since it "allows computing with incomplete programs", and holes "allow us to inspect the local typing context from where the hole was placed. This allows us to see what information is directly available when looking to fill the hole". I.e. the compiler provides hints about its attempt to infer the type of the missing value (aka. the hole). As opposed to requiring that a program is fully typed, this can afford a live programming environment that gives the programmer feedback, while editing, about how the program would be executed.
- COMPOSABLE. Favour composition over inheritance. Inspired by Robert C. Martin, Martin Fowler, and JSX in React. Composability entails that it should be easy to write code that is _declarative_, _isolated_ and _order-independent_. See "strongly typed".
  - Immutability enables composability, because it enables order-independence through managed effects.
- Memory safe, ergonomic, and fast: memory should be safely and implicitly handled by the language, without a runtime GC.
- No Garbage Collector (GC), but also no garbage. Deterministic object lifetimes (see the sketch after this block), and ownership tracking (an affinity type system). Inspired by Rust and Carp. Alternatively, the language could take inspiration from concatenative programming languages, which don't generate garbage by design, and use the stack heavily. Garbage is a symptom of memorizing, which is tedious for the programmer, as well as the compiler, as well as the runtime. Garbage comes when you have to clean up something you memorised (allocated memory for, but somehow stopped using further on). Concatenative programming is closely related to FP through continuation-passing style (CPS) and tail-call optimization (TCO). The language/compiler should utilize CPS where possible, so as to reduce/optimize usage of the stack. It should be able to store a continuation (equivalent to persisting the stack to RAM/disk) so that stateless programs (like a web server) could be restarted near a point of interruption/error (when the client makes the request again), to simulate statefulness. Inspired by Scheme.
- Memory management & safety: Automatic Reference Counting. Inspired by Roc. Maybe a Borrow Checker, for memory safety. Inspired by Rust. But ideally, Ownership and Borrowing should be implicit in the programming language, so the programmer wouldn't have to think about low-level concerns such as memory management (e.g. what goes on the stack vs. the heap) or various kinds of references. To avoid the conceptual overhead of manual memory management (as with explicit borrowing semantics), the language should perhaps use or take inspiration from Koka's Perceus Optimized Reference Counting.
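A minimal sketch of deterministic object lifetimes, using Rust's Drop: cleanup runs at a statically known point (the end of the owning scope), with no GC pause involved:

```rust
// A resource with an explicit cleanup action.
struct Resource(&'static str);

impl Drop for Resource {
    fn drop(&mut self) {
        println!("releasing {}", self.0); // runs at a statically known point
    }
}

fn main() {
    let _outer = Resource("outer");
    {
        let _inner = Resource("inner");
    } // "releasing inner" prints here, deterministically
    println!("end of main");
} // "releasing outer" prints here, no GC involved
```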
Koka apparently allows even more precise reference counting than Rust (see sect. 2.2 of the Perceus paper).
- Platform users can configure the memory management strategy. Inspired by Roc. It should for example enable choosing an Arena-allocation strategy for HTTP request-response cycles, for which it would be optimal.
- Arena Allocation.
- Secure from the start. Secure runtime. Inspired by Deno. Safety has to be a built-in design goal from the start; it cannot be added on later. As evidenced by the justifications for the existence of Deno (Node was unsafe) and Rust (C++ was unsafe). Also, see: memory safe.
- POWERFUL PRIMITIVES OVER BATTERIES-INCLUDED: Few, but powerful and composable, core primitives. Based on very few fundamental concepts to learn. Inspired by Lisp. Prefer uniformity and consistency. Counter-inspired by the only half-way interchangeable expressions and statements in JS.
- But avoid DSLs, since Domain-Specific Languages typically become mini-languages in their own right. Such languages are akin to dialects/sociolects that hinder generalised understanding and learnability (they add knowledge debt). Counter-inspired by Lisp (Lisp being too powerful) and Ruby. Even though it might be true that _"Domain-Specific Languages are the ultimate abstractions"_, as Paul Hudak put it in 1998, some cross-domain terms are usually helpful for onboarding programmers, since they afford familiar knobs on which to hang the other, unfamiliar code. Even if you don't understand the domain (or its plethora of abstractions), you would at least understand _something_, from which you could build your further understanding.
- Small focused core, with powerful composable primitives, that lends itself well to abstraction. Language extensible by library authors. Strong convention and encouragement for abstractions based on generalizable JTBD naming, instead of business/domain-specific DSLs (reasoning above).
- Modular composition, configurable. Inspired by Rollup plugins.
- A language for library authors. Inspired by the success of C++. The language should be able to EVOLVE BY COMMUNITY CONVENTION, NOT BY CENTRALISED SPECIFICATION: the language itself should be extensible with libraries (this would probably need some limited metaprogramming facility in the form of compile-time macros... good idea?). See: community grown, configurable language.
- FAST BRANCHING AND MERGING: What would be important is to facilitate fast and simple language _merging_, due to all the branching that would appear (for the aforementioned reasons; library-driven). Inspired by Git (fast branching and merging was the big idea behind much of Git's success). So the community can easily find its way back together after a split/branch (if their ideas and goals come back into alignment, and they have converged to an agreement on the features again). See also: "forward- and backward-compatibility".
- Be general-purpose enough to at least write scripts and CLIs, but also web servers/clients.
- Ergonomic to type. Prefer text over special characters like curly brackets (they are hard to tell apart from parentheses in JS). No littering of parentheses. Inspired by Ruby. Counter-inspired by JavaScript, Lisp, and JSON.
- No super-powerful tools which may hurt you or others in the long run. Counter-inspired by meta-programming in Ruby.
- Crash-safe. Can crash at any time and resume computation at the exact same spot when restarted. Inspired by Erlang.
- Piping, or some form of it. But always top-to-bottom or left-to-right.
Inspired by Bash, and functional programming with pipes (Elixir, BuckleScript, and ts-belt). Data-first instead of data-last.
- No Exceptions. Inspired by Golang. But Recoverable and Unrecoverable errors. Inspired by Rust. (Definitely no checked exceptions, as they break encapsulation by imposing behavior on the caller. Counter-inspired by Java.)
- Result data type, for error handling and validation; see the sketch at the end of this block. Inspired by Result from Rust and F#, and [Either from Haskell](https://blog.thomasheartman.com/posts/haskells-maybe-and-either-types) and Elm, although it should not be called Either, as Either is a confusing misnomer.
- Error handling & Nullability: The goal is to eliminate timid coding patterns like null checks everywhere. Counter-inspired by Golang. No implicit null or nil value. Meaning no runtime null errors (typically occurring far removed from their point of inception). Inspired by Elm and Rust. Ideally without having to explicitly declare Maybe aka. Option types (inspired by Hickey's Maybe Not talk). Could either automatically represent nilable variables as a union between the type and nil, so that the compiler can do null reference checks at compile-time (inspired by Crystal). Or _automatically but statically_ infer and create/augment a function's return type to a nullable reference type, indicated by a ? after the typename, whenever there is an unhandled condition that could result in a null value. Or _automatically_ create a NullObject (see: the NullObject pattern) of the function's declared return type. Maybe even better: let every type declare and handle its own empty state. If all types are defined in terms of Monoids, then _NULL_ COULD BE REPLACED BY THE IDENTITY VALUE (OF EACH MONOID), so that combinations within that type never fail, and never alter the result. NB: this would make it hard to express something which was supposed to be there but is missing, like a missing point on a graph curve, instead of plotting a definite 0. So choosing this approach would need careful consideration.
- When a function becomes more capable (by widening its allowed input, e.g. string to Option<string>, or tightening its returned result, e.g. Option<string> to string), it shouldn't break callers (which could otherwise result in cascading refactors, cf. What Color is Your Function?). A way to solve this would be if the language could automatically perform casting of such arguments/results, and had a type system that could account for that. Inspired by Flow, and counter-inspired by TS. Furthermore, the return type of functions using I/O (like the IO Monad in Haskell) should always be augmented/inferred through static analysis.
- _Variant Types_ for error-handling using return values (like Result<Type, Error>, inspired by Rust), instead of special syntax. Counter-inspired by Golang. So that you have fewer avenues to explore when debugging and fewer branches to check when programming, so you can write Confident Code focused on the happy path.
- No possibility of failing silently during runtime (due to syntax errors). Counter-inspired by JS.
- Compilation should be able to target some popular language & ecosystem: TRANSPILE TO JAVASCRIPT OR COMPILE TO WASM, or potentially even the JVM, to get cross-platform interoperability. But not any target at any cost, if it would put unwieldy constraints on the language design. WASM seems like the best candidate.
- Escape hatches: The language should have escape hatches that facilitate integration with other ecosystems, to aid rapid adoption.
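A minimal sketch of Result-style error handling with exhaustive pattern matching, in Rust: the failure case is part of the function's type, and the match is checked for totality at compile time:

```rust
use std::num::ParseIntError;

// The failure case is part of the return type, not an exception.
fn parse_port(s: &str) -> Result<u16, ParseIntError> {
    s.trim().parse::<u16>()
}

fn main() {
    // The match must cover every variant (totality checking): removing
    // either arm is a compile-time error, not a runtime surprise.
    match parse_port("8080") {
        Ok(port) => println!("listening on {port}"),
        Err(e) => eprintln!("bad port: {e}"),
    }
}
```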
Such escape hatches could compromise the strictness and guarantees of the language, but they should be available to those who want to take on that risk/burden.
- Library compatibility tool. So you can input a list of your stack of libraries, and it will tell you if and where they are incompatible. Counter-inspired by JS Fatigue.
- Small standard library. To have some common ground of consolidation, and to provide the basic and most common utils. So usage will be fairly standard, and coming into a new codebase won't feel too foreign. But not a too-big standard library, since it would be tied to language updates, which are slower, and community competition is better for adaptability over the long run.
- The minimal standard library should be designed and decided by one leader with good insight into what users need, and a strong appreciation for consistency. To avoid endless bikeshedding. This is the only place where the language should have a benevolent leader, for a limited time.
- COMMUNITY GROWN / hands-off leadership: No BDFL, since it impedes evolution & diversity. _"When a language accepts bottom-up adaptations (from the users) it will handle new topics and new problems more efficiently than when it needs to wait for top-down approval of such adaptations."_ (from Will ugly languages always bury pretty ones?). The language designer should be more of an arbitrator, whom the community can lean on in discussions regarding what should be the default convention. The designer(s) and stewards of the language should also be nice (so the community will be welcoming and thus flourish). Inspired by Ruby's "Matz is nice, so we are nice". Even though none of this is strictly a language "feature", it nonetheless has _major_ impact on a language and its development, so it deserves a mention. Furthermore, that ALL PEOPLE SHOULD HAVE ACTUAL OWNERSHIP OF THE DEVELOPMENT OF THE LANGUAGE is vital for contributions and growth. The language might not grow exactly where the designer intends, but a centralizing authority (like a BDFL) may just as well be stifling growth (and causing pain) as leading it. Counter-inspired by Elm and Clojure. Yes, wild growth might lead to some weeds (bad dialects/libraries), but leadership through conventions and good defaults could alleviate the potential analysis paralysis and decision fatigue. Counter-inspired by JS fatigue. Ultimately, the power and impetus should reside with The Man in The Arena, and that man/woman should always be able to be you. THE LANGUAGE STANDARD / MAINSTREAM SHOULD UPSTREAM/INCORPORATE CHANGES found to be popular in the wild (amongst all the various dialects/customisations), and at the same time ensure they are incorporated well (coherently and consistently).
- Free experimentation on branches that can be upstreamed: It is hard to predict the effect of novel language features, especially before they've been tried in the wild for some time. Languages evolving by centralized committee tend to evolve slowly, as for each new feature the committee has to come to a consensus, and predict and test its use. Whereas in extensible languages, anyone in the community may modify the language without permission, and test it on their own. This is much faster, and dialects can be tried in parallel. They could then be upstreamed back into the mainstream language dialect. This could be a sweet spot.
- How could the language be VERY CONSTRAINED while at the same time COMMUNITY GROWN?
The language core should be very constrained around composition of a few core primitives (self-hosting), but it could be modified or built upon by others. So that it could evolve along multiple avenues of exploration, and gain from the competition. It would then be up to the community to decide whether they want to use the constrained version(s) (suitable for large-scale complex environments), which I prefer, or the bring-your-own-syntax version(s) (suitable for small-scale playful experimentation and research) which would inevitably appear. Inspirations here would be Lisp, Clojure and Racket. What would be important is to facilitate simple language _merging_, due to all the divergence that would appear. Inspired by Git. So the community could easily find its way back together after a split (if their ideas and goals come back into alignment, and they have converged to an agreement on the features again).
- Single package directory: Some sort of singular reference to a library package information service. So the community can organise around one common point, instead of scattering. Inspired by NPM. But it doesn't necessarily need to be centralised package download/storage; the storage/download could be decentralised. But it would need to be safe. Cert signing?
- Runtime environment: Be able to run on some existing popular cross-platform runtime (like WASM, or maybe the JVM?). Inspired by Clojure. And/or have a very minimal programming language runtime (without a GC). Inspired by Rust. But the runtime should in any case handle the scheduling of goroutines, inspired by Golang.
- ECOSYSTEM: INTEROPERABLE with one or more existing programming language ecosystems. To import or reuse libraries. Without too much ceremony. So the ecosystem doesn't have to start from scratch. Counter-inspired by Elm. _Smooth interoperability with existing ecosystems and other systems, minimising glue code, is one of the most underestimated features of a language, in terms of enabling its success._ Inspired by C. While a fully integrated system can be very nice, it inevitably risks being disrupted by a thousand small cuts (i.e. being made irrelevant to a project because other tools/services outperform it on one or two critical features, or because interop is needed). Counter-inspired by Ruby on Rails, Lamdera, Elm and Dart. The world is heterogeneous, and no single system yet has been able to solve all relevant problems for all people. Many have tried to own the world, like Smalltalk, Imba, Darklang, etc., but this can be an impediment to mass adoption. A language as a small focused tool which lives well in a heterogeneous environment is the way to go.
- C ABI: Compatible with the C language Application Binary Interface (ABI). So code in the language is usable from other languages. Inspired by Zig. Since compiling to WASM is desirable, WASM's C ABI could probably be used, instead of a separate implementation towards the C ABI.
- "WebAssembly [WASM] describes a memory-safe, sandboxed execution environment", where WASM's security guarantees eliminate "dangerous features from its execution semantics, while maintaining compatibility with programs written for C/C++." NB: But WASM's restrictions _might_ conflict with runtime dynamism and the desired live REPL feature (inspired by Lisp/Clojure)...? "Since compiled code is immutable and not observable at runtime, WebAssembly programs are protected from control flow hijacking [code injection] attacks."
- Editor integration: Should afford simple integration into editors/IDEs like VS Code.
Typically via the Language Server Protocol (LSP). Inspired by Rust. Syntax highlighting, plus a language server (for autocomplete, error-checking/diagnostics, jump-to-definition, etc.) via the LSP.
- Interactive: facilitates an IDE plugin (VS Code) that shows the contents of data structures while coding. Enable REPL'ing into a live system. Inspired by Clojure. But with some security, so that a rogue/unwitting programmer can't destroy the system/state. Counter-inspired by Smalltalk. Some form of Hot Reload / Hot Upgrades at runtime, even though the language is statically typed. Perhaps by requiring that the swapped-in functions must take in and return the same types as the previous version (i.e. an enforced interface). Inspired by Facebook's usage of Haskell. NB: Might conflict with compiling to WASM, since WASM gives a restricted environment. See the section on WASM.
- "Comments should be separate from code, joined at IDE or joined via manual tooling. This would allow comments to span multiple lines/functions and files. IDE could also alert when breaking changes are made. Pairs well with the Content-addressable code wish." Inspired by supermancho @ HN. You could also show/hide comments, and click on a particular piece of code or variable to see the comments for it. Without having to visually map references on a comment line to the actual variable, which is also prone to documentation drifting out of sync with the code it is documenting. With comments tied to content-addressable code, deleting/updating the code also deletes/updates the comment. Renamings would be transparent and automatic, but when the code changes structurally the IDE could warn that the corresponding comment/doc needs to be updated.
- The expansion of function definitions inline, on demand. "Take the definition, splice it into the call site, rename local variables to match the caller", as JonChesterfield @ HN said. So you don't have to jump around different files, which may make you lose your state of programming flow. Content/code should even be editable then and there, and simply stored back to the files where it resides. Inspired by supermancho @ HN, Lisp IDEs, and TailwindCSS. Content/code (and navigating it) should be freed from file boundaries (see also: content-addressable code). Inspired by Git.
- Well-documented. Documentation on language syntax should be accessible from the editor/IDE, via the LSP. Docs should be versioned, so that docs for old versions never disappear, either from the web or from the IDE integration. Inspired by ReScript, and counter-inspired by Emotion (CSS-in-JS).
- No @decorators. Counter-inspired by Angular and NestJS. Decorators feel like "magic" that makes the runtime control-flow unobvious. I don't like macro-expansions, either. I don't want to talk to / instruct the compiler. I want the compiler to understand how I write my program. Even if that limits the ways in which I can write my program. Rather than allowing a wide range of styles, and then having to decorate certain styles ad-hoc, to disambiguate them. I don't like the aesthetics of decorators either.
- Well-tested.
- Open-source, with a team: Developed as open-source from the start, of course. By more than 1 hero programmer (see: bus factor). Preferably 4-5 people working in tandem.
- Big decisions via public RFCs (Requests For Comments).
- All the while, the language should avoid the fate of the Vasa.😂 Which means a feature-creep-resistant _core language_.
(I am aware the irony of this feature list, but read on...) Which should be designed and decided upon as early as possible (when the degrees of freedom in the design space is as wide as possible), with a holistic view. Boring > clever. Designed to reach a 80% sweet spot of most important features, foregoing the most exotic and esoteric features, and foregoing the ability to solve edge-cases (such should be relegated to interoperability with other more specialized programming languages). Since 80% of the work and complexity would come from the last 20% (The Pareto Principle). GOAL: REDUCE COMPLEXITY FOR APP DEVELOPERS, BY ABSTRACTION AND WISE PLATFORM DEFAULTS. This ties back to the principle that: - A language determines WHAT you can & have to think about, but also HOW you have to think about it. And the desired features that the language should be: - Designed for fast onboarding of complete beginners (as opposed to catering to a specific language community who already have the curse of expertise). - Very high level. Abstract as much of the details as possible. Abstraction means to "draw away" the concrete nature of things, so that their commonalities remain. - Have a small Meta-Language. For onboarding with low overhead. Counter-inspired by BuckleScript. - Simple, with a well thought out vocabulary. Inspired by Clojure (except cons and conj, which are too similar-looking). Most languages presume the app developer will deal with a lot of relatively low level concerns, to get pretty obvious benefits we _should_ be able to take for granted (e.g. concurrency and parallelism). The sentiment "I don't know, I don't wanna know", as Rich Hickey put it, applies to this. However, it does not mean hostility towards learning, but a certain amount of healthy scepticism: if you have to document something to a large degree, have you really simplified it enough? _"Everything should be made as simple as possible, but not simpler."_, accordig to Occam, Sessions, and Einstein. Too much documentation (i.e. meta-language) is a code smell, since code should be self-documenting. A language's complexity consist of it's syntax and semantics, but also its meta-language (which should be minimised). a language should not burden the speaker/thinker with unnecessary complexity, as that cognitive effort is better spent on the task at hand: the domain is often complex enough in itself! We shouldn't invent problems to solve, even if the solutions could be beautiful. Here is the start of a non-exhaustive list of what the application developer should have to be concerned with, i.e. what the language should afford as syntax and semantics when I'm coding (which does not exclude how the language libraries/runtime implements it under-the-hood): ----------------------------------------------- What I want to think I don't want to know or about: think about: ----------------------- ----------------------- Splitting up the Concurrency vs. problem/data into Parallelism, separate pieces. goroutines, threads, Fibers, Actors, Channels, processes, CPU cores, Microservices, Distributed system topology, Mutexes, Locking, ... Function composition. Monads, Monoids, Class structure, Contextual precoditions, Inheritance rules, Type / Category Theory, ... Choosing appropriate Immutability, Mutable algorithms and data vs. immutable structures. references, Pointers, State management, Data-flow architecture, Complex types, type inference/conversion, ... Expressing what should Pointers, Memory go together, management, Stack vs. 
YOU SHOULD SIMPLY BE ABLE TO DESCRIBE TO THE MACHINE WHAT THE PROBLEM LOOKS LIKE, HOW IT COULD BE DIVIDED UP, AND HOW (THE ALGORITHM) TO SOLVE IT.

Then the machine (i.e. the language runtime) should decide _when_ and _where_ it wants to solve it (based on its hardware/environmental constraints): whether that means single- or multi-threading parts of the work, or, in case the local resources are or become strained, distributing the work over multiple machines (depending on the measured latency of their inter-network connection). So the language should have an ADAPTIVE RUNTIME; in lieu of that, it should at least have a platform-configurable compiler that could make some general, universally applied decisions based on configuration of specific platform constraints. I want a _capable_ language runtime/platform, so I can write _lean_ programs. Lean, as in: not loaded with what _ought to be_ low-level concerns.

THE END

One or more of these requirements might be conflicting or mutually exclusive. Maybe. But maybe not? One can always dream. This is a list of my preferences, and some would probably be quite controversial, like my aversion to certain features which a lot of other people like (e.g. meta-programming). I might just not be familiar enough with them to have developed an appreciation for them. I will try to keep this list updated if and when I change my mind on any point, which I am open to doing. I have already changed my mind from negative to positive on generics and pattern-matching.

What features (or lacking features) would your dream programming language have?
Gluteus as a rare localization of extragonadal teratoma
Davide Leardini
Sara Cerasi

Davide Leardini

and 5 more

February 16, 2022
GLUTEUS AS A RARE LOCALIZATION OF EXTRAGONADAL TERATOMA

Authorship: Leardini Davide1, Cerasi Sara1, Cantarini Maria Elena1, Facchini Elena1, Prete Arcangelo1, Masetti Riccardo1

1 Pediatric Oncology and Hematology Unit "Lalla Seràgnoli", Istituto di Ricovero e Cura a Carattere Scientifico (IRCCS), Azienda Ospedaliero-Universitaria di Bologna, Bologna, Italy

Correspondence: Sara Cerasi; address: via G. Massarenti 11, 40138, Bologna (BO), Italy; phone: +39 051 2144665; e-mail: sara.cerasi@studio.unibo.it

Word count for main text: 547 words
Attached to this manuscript there is 1 figure
Keywords: extragonadal teratoma, gluteal mass, immature teratoma
Running title: Rare non-midline teratoma

To the Editor,

Teratomas represent the most common germ cell tumors in children1. They can be gonadal, more common in adolescents, or extragonadal, found primarily in neonates and young children. Teratomas develop from totipotent primordial cells and may originate anywhere along the midline. Common sites for extragonadal teratomas are the sacrococcygeal region, which accounts for 35-60% of all teratomas, the mediastinum, the retroperitoneum, the head and neck, and the central nervous system2-4. Other localizations are rare, especially non-midline ones, which are very often lateralized expansions of midline teratomas such as those arising from the sacrococcygeal region.

We here describe the case of a newborn girl presenting with a gluteal mass that proved to be a primary extragonadal teratoma. At birth she presented with a hard-elastic, mobile and painless mass localized within the right gluteus (Fig. 1) that had not been noted on prenatal ultrasound. At two days of life, an ultrasound examination was performed, revealing a subcutaneous, irregularly hypoechoic mass with fluid areas inside and small vessels, with aspecific characteristics. The mass measured 24x15 mm and rapidly increased both in size, reaching 40x25 mm, and in the number of fluid areas (Fig. 1). The alpha-fetoprotein serum concentration was 4586 ng/mL (reference value for the 2 weeks-1 month interval at which she was tested: 316-6310 ng/mL) and hCG was 0.8 IU/L (normal value <5 IU/L)5. After an MRI excluded other lesions, the mass was removed and a biopsy was performed, revealing an immature teratoma, grade 3 according to Norris's classification. Since extragonadal teratomas outside the midline are very rare, she has been followed up thoroughly for 3 years with regular periodic blood tests and radiological assessments, but no other primary lesions or recurrences were found.

Teratomas can be malignant (12-14%) or benign, the latter further divided into mature (50-60%) and immature (18-34%). Immature teratomas contain fetal tissue, most often neuroectodermal, the amount of which is scored according to a grading system introduced by Norris. Grade 3 contains the most neuroectodermal tissue and carries an increased incidence of local recurrence and malignant degeneration3,4,6. Complete and prompt surgical resection is the gold standard for definitive therapy in benign teratomas, both mature and immature4,6.

Teratomas develop along the midline because they originate from the incomplete differentiation of totipotent primordial cells that arise in the yolk sac and migrate along the mesentery to the gonadal ridge during the 4th-5th week of embryologic development3,4.
Indeed, most of the gluteal teratomas reported in the literature are lateralized sacrococcygeal teratomas with a connection to the coccyx, since sacrococcygeal teratomas are thought to derive from totipotent cells of Hensen's node (the primitive knot), an area at the cranial end of the primitive streak7,6. Other authors have reported rare sites for lateralized teratoma development such as the kidney, liver and temporozygomatic region8,9. Rare lateralized extragonadal localizations should not mislead the clinical suspicion of teratoma, and a primary localization should always be excluded; taking the patient's age into account, a PET scan can sometimes be considered. To the best of our knowledge, there is just one case in the literature of a gluteal teratoma not in connection with the coccyx, as in our patient, thus confirming the possibility of this very rare localization10. The biological mechanism for germ-cell migration to such anatomical regions remains to be elucidated.

ACKNOWLEDGEMENTS: None.

CONFLICT OF INTEREST
The authors declare that there is no conflict of interest.

ETHICS STATEMENT
Written informed consent has been obtained from the patient to publish this paper.
Multiscale analysis of monoglyceride oleogels during storage
Kato Rondou
Fien De Witte

Kato Rondou

and 6 more

February 16, 2022
Oleogelation offers the possibility to reduce the saturated fatty acid (SAFA) content of foods while maintaining the desired organoleptic properties: SAFA are replaced by other structurants, which create a three-dimensional network that immobilizes the liquid oil. Depending on the type of structurant, different structuring routes are identified. The use of monoglycerides (MAGs) as structurants is a promising approach thanks to their excellent self-assembly properties; however, implementation in the food industry is still hampered by insufficient characterization. This research presents a multiscale analysis of two dynamically produced MAG-based oleogels as a function of storage time (up to 8 weeks). Slight differences in the production process resulted in differences in techno-functional properties between the MAG-based oleogels MO1 and MO2. MO1 consisted of larger crystals, which resulted in lower rigidity, lower stability and lower oil-binding capacity compared to the other oleogel (MO2). On the nanoscale, the crystal nanoplatelets (CNPs) of MO1 were found to contain a higher number of lamellae than those of MO2. Additionally, results obtained with ultra-small-angle X-ray scattering indicated a larger equivalent diameter for the CNPs of MO1. Neither oleogel showed major structural changes over up to 8 weeks of storage.
KBG syndrome: case report of a novel variant ANKRD11 gene mutation and literature revi...
Sestito Simona
Petrisano Mirella

Sestito Simona

and 5 more

February 16, 2022
KBG syndrome is a rare genetic disorder caused by mutations in the ANKRD11 gene. The clinical phenotype is characterized by dysmorphic features, short stature, intellectual disability and global developmental delay, although these features may not be present at the first examination. We describe a novel ANKRD11 gene variant.
Repair of double orifice mitral valve with an atrioventricular septal defect in a gir...
Alwaleed Al-Dairy
Samir Srour

Alwaleed Al-Dairy

and 3 more

February 16, 2022
Ellis-van Creveld syndrome is a rare autosomal recessive disorder. We describe the case of a 7-year-old girl with Ellis-van Creveld syndrome diagnosed with a common atrium and a partial atrioventricular septal defect. She underwent successful surgical repair, during which a double orifice mitral valve was diagnosed as well.
Evaluation and improvement of soil water characteristic curves through in-situ monito...
Pingnan Zhang
Chuanhai Wang

Pingnan Zhang

and 6 more

February 16, 2022
In the agricultural areas of humid plains, soil water migration and exchange events are frequent and disruptive. Studies on the soil water characteristic curve (SWCC) of unsaturated zones show that it is significantly impacted by regional water cycles. In this study, a variety of hydrological parameters, such as rainfall, evaporation, soil water content, and groundwater level, were continuously collected at the Jintan experimental site in the plains of the Taihu Basin, China. The observations show that the soil water content changes drastically during the flood season every year, with obvious absorption and desorption processes. In terms of soil water content, the soil layer at a depth of 0-40 cm below the ground is the zone most frequently and severely altered. The SWCC based on field data was obtained through numerical inversion of soil water characteristic parameters in a numerical code (HYDRUS-1D). Compared with the curves measured in the laboratory, the field-derived curves in each depth range are more consistent with the characteristics of silty clay loam. The SWCCs from the field data and the laboratory data were each applied to simulate soil water content at different depths during precipitation events. The simulation results based on the field data showed significantly better correspondence than the laboratory-based results and were more consistent with the changes in soil water content measured in the field.
A case of retroperitoneal abscess caused by infection of urachal remnant
Akira Yoneda
Taiji Hida

Akira Yoneda

and 9 more

February 16, 2022
Infection of a urachal remnant may cause recurrent abscesses. In the current case report, we describe a urachal remnant infection leading to a retroperitoneal abscess, an extremely rare condition. In such cases, the recommended treatment is resection of the urachal remnant.
Recombinant Limosilactobacillus (Lactobacillus) delivering nanobodies against Clostri...
Dharanesh Gangaiah
Valerie Ryan

Dharanesh Gangaiah

and 8 more

February 16, 2022
Necrotic enteritis (NE), caused by Clostridium perfringens, is an intestinal disease that inflicts devastating economic losses on the poultry industry. NE is a complex disease, and predisposing factors that compromise gut integrity are required to facilitate C. perfringens proliferation and toxin production. NE is also characterized by drastic shifts in the gut microbiota; C. perfringens abundance is negatively correlated with that of Lactobacilli. Vaccines are only partially effective against NE, and antibiotics raise concerns about resistance development; these strategies address only some aspects of NE pathogenesis. Thus, there is an urgent need for alternative strategies that address multiple aspects of NE biology. Here, we developed Limosilactobacillus (Lactobacillus) reuteri vectors for in situ delivery of nanobodies against NetB and α toxin, two key toxins associated with NE pathophysiology. We generated nanobodies and showed that they neutralize NetB and α toxin. We selected L. reuteri vector strains with intrinsic benefits and demonstrated that these strains inhibit C. perfringens and secrete over 130 metabolites, some of which play a key role in maintaining gut health. The recombinant L. reuteri strains efficiently secreted the nanobodies, and these nanobodies neutralized NetB. The recombinant strains were genetically and phenotypically stable over 480 generations and showed persistent colonization in chickens. A two-dose regimen of in ovo and drinking-water administration of the recombinant L. reuteri strains protected chickens from NE-associated mortality. These results provide proof-of-concept data for using L. reuteri as a live vector for the delivery of nanobodies, with broad applicability to other targets, and highlight the potential synergistic effects of vector strains and nanobodies in addressing complex diseases such as NE.
Pigmented and polypoid tumor of the pubis
Mariem Rekik
Khadija Sellami

Mariem Rekik

and 4 more

February 16, 2022
Seborrheic keratosis is a common benign epidermal tumor occurring in patients aged over 50 years. It is located preferentially on the trunk, head and neck; the genital location is rare. We report the case of a 59-year-old man presenting with a seborrheic keratosis of the pubis.