AUTHOREA

Preprints

Explore 66,105 preprints on the Authorea Preprint Repository

A preprint on Authorea can be a complete scientific manuscript submitted to a journal, an essay, a whitepaper, or a blog post. Preprints on Authorea can contain datasets, code, figures, interactive visualizations and computational notebooks.

Autonomous Self-Healing Datacenter Networks: A Unified AI System for Prediction, Dete...
Dheeraj Ramasahayam

March 20, 2026
Modern datacenter networks still operate through fragmented workflows in which predictive maintenance, intrusion detection, root cause analysis (RCA), and remediation are studied separately and deployed through loosely coupled tooling. This paper presents a unified AI system for autonomous self-healing datacenter networks that connects four stages: temporal failure prediction, drift-adaptive intrusion detection, topology-aware RCA, and safety-gated recovery with counterfactual validation. The architecture combines streaming telemetry, network-flow analytics, graph reasoning, and a topology digital twin inside a single operational loop. The system is formalized as a constrained sequential decision problem over telemetry, flows, topology, and policy constraints, and is evaluated through staged module validation plus a trace-driven closed-loop emulation. Because no public benchmark spans all four stages jointly, the empirical evidence combines public telemetry and flow datasets, streaming emulation, packet-capture replay, topology-grounded recovery traces, and a synthetic end-to-end incident timeline that makes the module handoff contract explicit. Across failure-prediction benchmarks, the temporal sequence model reaches F1 scores of 0.3737 on optical zero-shot hard-failure evaluation and 0.4677 on Cisco BGP failure prediction within a 60-second warning window. In intrusion detection, the drift-adaptive hybrid improves weighted F1 from 61.35% to 68.69% on full CICIDS2017 cross-dataset transfer without retraining the base detectors and reaches 98.05% weighted F1 in a packet-capture replay case study. For RCA, topology-aware reasoning reaches 0.8380 target-localization F1 with 1.0000 hidden-target accuracy and 0.9394 temporal RCA accuracy at 5.2 s mean detection delay. In the recovery twin, gated actions improve mean reachability from 0.9740 to 1.0000, achieve 0.8182 recovery success, and block 100% of mismatched unsafe actions. 
The results show that prediction, detection, diagnosis, and remediation can be organized into a reproducible closed loop for next-generation self-healing datacenter networks.
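As a concrete illustration of the safety-gating idea above, the following is a minimal sketch, not the paper's implementation: the function names, the reachability metric (fraction of reachable ordered node pairs), and the action format are all assumptions invented for this example. A proposed remediation is first applied to a copy of the topology (a stand-in for the digital twin), and the action is blocked if counterfactual reachability would degrade.

```python
from collections import deque

def reachability(adj):
    """Fraction of ordered node pairs (u, v) with a path from u to v."""
    nodes = list(adj)
    reached = 0
    for src in nodes:
        seen = {src}
        queue = deque([src])
        while queue:  # BFS from src
            cur = queue.popleft()
            for nxt in adj.get(cur, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        reached += len(seen) - 1
    total = len(nodes) * (len(nodes) - 1)
    return reached / total if total else 1.0

def apply_action(adj, action):
    """Return a copy of the topology with the action's link changes applied."""
    new = {n: set(peers) for n, peers in adj.items()}
    for u, v in action.get("remove", []):
        new[u].discard(v)
        new[v].discard(u)
    for u, v in action.get("add", []):
        new.setdefault(u, set()).add(v)
        new.setdefault(v, set()).add(u)
    return new

def safety_gate(adj, action):
    """Counterfactually validate an action in the twin copy;
    block it if reachability would degrade."""
    before = reachability(adj)
    after = reachability(apply_action(adj, action))
    return after >= before  # allow only non-degrading actions
```

With a faulted three-node topology missing one link, an action restoring the link passes the gate, while an action that would partition the network further is blocked.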
ClosRCA-Bench: An Open Topology-Grounded Benchmark and Counterfactual Recovery Framew...
Dheeraj Ramasahayam

March 20, 2026
Open research on datacenter-network root cause analysis (RCA) is limited by two recurring problems: many studies rely on private production traces, and public studies often omit the topology and remediation context needed for self-healing systems. This paper introduces ClosRCA-Bench, a reproducible topology-grounded benchmark constructed from Cisco's public Clos-topology telemetry repository by combining event files, CDP maps, and per-device YANG telemetry into fixed graph windows. The resulting benchmark contains 311 windows over 11 topology nodes with 30 features per node and four cause families: BFD outage, blackhole, ECMP change, and interface shutdown. Two fault classes localize to hidden target devices that are not directly monitored, making topology-aware localization a first-class task. The paper evaluates rule-based, correlation-based, graph-only, and spatio-temporal graph RCA methods, then measures remediation with a safety gate and a counterfactual topology digital twin. On the held-out split, Random Forest achieves the strongest anomaly F1 at 0.9688 and weighted RCA cause F1 at 0.9707. The full STGNN reaches 0.8380 weighted F1 for target-device localization and 1.0000 hidden-target accuracy, while the no-topology ablation collapses to 0.0000 hidden-target accuracy. In temporal tracking, the full STGNN attains 0.9394 RCA accuracy with 5.2 s mean detection delay, improving over the graph-only baseline's 6.3 s delay at the same RCA accuracy. On the compound-failure slice, the full STGNN retains 0.9130 cause accuracy compared with 1.0000 on single-failure windows. In the counterfactual recovery twin, gated actions improve mean reachability from 0.9740 under fault to 1.0000 after recovery and achieve a 0.8182 recovery-success rate while blocking 100% of mismatched unsafe actions. The main contribution is therefore an open benchmark and evaluation protocol that makes topology-aware, temporal, and recovery-aware RCA measurable.
Drift-Adaptive Intrusion Detection for Enterprise Networks
Dheeraj Ramasahayam

March 20, 2026
Enterprise intrusion detection remains a moving target because traditional rule-based systems are fast but narrow, while learned detectors often report strong in-dataset performance without demonstrating how they adapt to live distribution shift. This paper presents a reproducible benchmark over four evaluation settings: the full official UNSW-NB15 split, the full official NSL-KDD split, the full external CICIDS2017 corpus, and a cleaned CSE-CIC-IDS2018 external corpus. A transparent flow-signature IDS baseline is compared against Random Forest, LSTM, Transformer, a static Drift-Aware Hybrid, and an online Drift-Adaptive Hybrid controller that reweights the ensemble under detected shift. On UNSW-NB15, the static Drift-Aware Hybrid achieves the strongest weighted F1 score of 90.72%, slightly above Random Forest at 90.65%. On NSL-KDD, LSTM achieves the best weighted F1 score of 81.01%. Under cross-dataset transfer into the full CICIDS2017 corpus, LSTM remains best overall with a weighted F1 score of 71.53%, but the online Drift-Adaptive Hybrid improves the static hybrid from 61.35% to 68.69% without retraining the base detectors. A formal drift-detector study comparing Isolation Forest, ADWIN, DDM, and Page-Hinkley finds that Isolation Forest is strongest in this benchmark, reaching 70.58% post-adaptation weighted F1 with zero source-domain false positives. On CSE-CIC-IDS2018, the flow-signature baseline is unexpectedly strongest at 66.37% weighted F1, exposing the limits of source-only learned transfer. Family-level failure analysis on CICIDS2017 further shows that the adaptive hybrid partially recovers FTP-Patator traffic with an F1 of 0.1793 where the source-trained LSTM collapses. Finally, a replayed packet-capture case study over 221,253 real packets from a local service attack trace detects the attack immediately at onset and achieves 98.05% weighted F1 on one-second bidirectional flow windows.
The repository also reports latency-under-load, real-time streaming traces, deployment architecture, and explainability-by-ablation artifacts, making the benchmark useful both as an academic paper and as an open-source systems study.
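The abstract does not specify the reweighting rule of the online Drift-Adaptive Hybrid, so the following is a generic sketch of one plausible controller; the class name, the agreement-based weight update, and the frozen-detector interface are all assumptions, not the paper's design. On a drift signal (e.g., from ADWIN or Page-Hinkley), ensemble weights shift toward detectors that agree with the recent ensemble majority, without retraining the detectors themselves.

```python
import numpy as np

class DriftAdaptiveEnsemble:
    """Toy controller that reweights frozen detectors under detected shift.

    Detectors are callables mapping a feature vector to a score in [0, 1].
    Under drift, weights move toward detectors that agree with the
    ensemble majority on recent traffic; no detector is retrained.
    """

    def __init__(self, detectors, window=200, lr=0.1):
        self.detectors = detectors
        self.weights = np.ones(len(detectors)) / len(detectors)
        self.window = window
        self.lr = lr
        self.recent = []  # (per-detector binary predictions, ensemble label)

    def predict(self, x):
        scores = np.array([d(x) for d in self.detectors])
        label = int(self.weights @ scores >= 0.5)
        self.recent.append((scores >= 0.5, label))
        if len(self.recent) > self.window:
            self.recent.pop(0)
        return label

    def on_drift(self):
        """Called by an external drift detector when shift is flagged."""
        if not self.recent:
            return
        preds = np.array([p for p, _ in self.recent])   # (n, k)
        labels = np.array([l for _, l in self.recent])  # (n,)
        agreement = (preds == labels[:, None]).mean(axis=0)
        self.weights = (1 - self.lr) * self.weights + self.lr * agreement
        self.weights /= self.weights.sum()
```

The agreement signal is a pseudo-label proxy: with no ground truth available online, detectors that track the ensemble consensus on post-shift traffic gain weight, which is one simple way to adapt without retraining.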
Energy from the depths: how aquatic chemoautotrophy fuels terrestrial food webs in a...
Raluca Bancila
Ruxandra Niţescu

and 6 more

March 19, 2026
Trophic transfers from aquatic to terrestrial communities are well-documented in surface ecosystems which are dependent on photosynthesis, yet they remain largely understudied in sulfidic hypogean karst systems sustained by chemoautotrophy. Here, we applied a multi-approach stable-isotope analysis (δ¹³C, δ¹⁵N) to quantify trophic links between sulfidic groundwater and subterranean terrestrial communities in Sulfur Cave (Sarandaporo Valley, Albania-Greece border), tracing energy flow from chemoautotrophic microbial biofilms through dominant aquatic primary consumers (Tanytarsus albisutus, Chironomus riparius, Contacyphon palustris) to terrestrial predators (spiders, centipedes, pseudoscorpions and scorpions). Distinct trophic structures emerged among cave zones, with spatially segregated food webs clearly separated by stable-isotope signatures. Low niche overlap indicates limited sharing of trophic resources and a high degree of compartmentalization within the cave ecosystem. Chemosynthetically produced food was transferred almost exclusively to terrestrial predators via emerging aquatic insects, driving a largely unidirectional energy flow. A top predator showed zone-specific reliance on emerging aquatic insects indicating that aquatic subsidies cause predators to switch prey, depending on prey availability and location within the cave. Overall, this study challenges long-standing paradigms in subterranean ecology derived largely from epigenic karstic caves and sparsely studied sulfidic systems. Thus, we highlight the need to reconsider generalizations drawn from non-sulfidic karst systems when interpreting energy flow and ecosystem functioning in chemically driven subterranean systems.
Securing LLM Software Supply Chains: A Layered Lifecycle Framework with Hybrid Post-Q...
Rafael Silva
Luiz Duarte

and 1 more

March 23, 2026
Large Language Models (LLMs) are increasingly embedded in production systems, creating complex development and deployment pipelines that resemble modern software supply chains. These pipelines span dataset ingestion, training workflows, model artifacts, distribution registries, inference infrastructure, and downstream application integration. While prior work has extensively examined model-centric threats such as extraction, inversion, and prompt injection, the security of the complete LLM lifecycle remains insufficiently addressed. Existing software supply chain frameworks, including Sigstore, in-toto, Supply Chain Levels for Software Artifacts (SLSA), and The Update Framework (TUF), provide strong guarantees for artifact integrity and provenance. However, these mechanisms were designed primarily for conventional software artifacts and do not fully capture AI-specific lifecycle threats such as data poisoning, training pipeline compromise, and retrieval-augmented prompt manipulation. In addition, most artifact-signing infrastructures rely on classical cryptography that may face long-term risk from quantum adversaries. This paper proposes a layered security framework for LLM software supply chains. The framework models LLM security as a lifecycle control problem spanning five layers: data, training, model, inference, and application. For each layer, the framework identifies representative assets, threat classes, and defensive controls.
The paper further analyzes the role of hybrid post-quantum signatures, combining classical digital signatures with post-quantum schemes such as ML-DSA, to strengthen artifact integrity at the model distribution stage. The proposed framework provides a structured model for analyzing threat propagation across the LLM lifecycle while remaining compatible with established supply chain technologies. We contribute a formal threat taxonomy, a comparative analysis against existing frameworks, and a qualitative evaluation demonstrating why artifact integrity alone is insufficient for securing LLM supply chains.
Keywords: Large Language Models, AI Supply Chain Security, Post-Quantum Cryptography, Hybrid Signatures, Model Integrity, Software Supply Chain Security, Secure ML Lifecycle
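The AND-composition at the heart of hybrid signing can be sketched as follows. This is a toy: HMAC stands in for both the classical scheme (e.g., Ed25519) and the post-quantum scheme (e.g., ML-DSA) purely so the example runs without external crypto libraries, and all function names are hypothetical.

```python
import hashlib
import hmac

# HMAC-SHA256 is a stand-in "signature" for illustration only; a real
# deployment would pair an ECDSA/Ed25519 signer with an ML-DSA signer.
def sign(key: bytes, artifact: bytes) -> bytes:
    return hmac.new(key, artifact, hashlib.sha256).digest()

def verify(key: bytes, artifact: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, artifact), sig)

def hybrid_sign(classical_key: bytes, pq_key: bytes, artifact: bytes) -> dict:
    """Attach both signatures to the model artifact."""
    return {"classical": sign(classical_key, artifact),
            "pq": sign(pq_key, artifact)}

def hybrid_verify(classical_key: bytes, pq_key: bytes,
                  artifact: bytes, bundle: dict) -> bool:
    """AND-composition: reject unless BOTH signatures check out, so the
    bundle stays unforgeable while either primitive remains unbroken."""
    return (verify(classical_key, artifact, bundle["classical"])
            and verify(pq_key, artifact, bundle["pq"]))
```

The design choice worth noting is the AND rule: an OR rule would let a quantum adversary who breaks only the classical scheme forge an acceptable bundle, which defeats the purpose of hybridization.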
Deriving Maxwell's Equations and the Fine-Structure Constant from the Rotor Field Equ...
Stephen Euin Cobb

March 20, 2026
Maxwell's equations and the fine-structure constant are central elements of electromagnetic theory, yet in conventional physics they enter as postulated field relations and experimentally determined parameters rather than quantities derived from deeper geometric principles. In this paper we show that both arise naturally within the Rotor Dynamics Framework, in which matter and radiation are interpreted as manifestations of curvature circulation in a four-dimensional rotor manifold. Beginning with the rotor field equation governing curvature evolution, we derive a continuity law expressing the conservation of curvature flux in the vacuum manifold. When the four-dimensional curvature vector is projected into the observable three-dimensional spatial domain, this conservation law yields the full set of Maxwell equations, identifying the electric and magnetic fields as complementary projections of a single underlying curvature structure. Electromagnetic waves then appear as propagating oscillations of rotor curvature, corresponding to open rotor modes that transport curvature phase through the vacuum continuum. Within the same framework the fine-structure constant emerges as a geometric compatibility ratio governing curvature exchange between rotor domains, particularly between the electron and proton structures. The resulting interpretation places both the equations of electromagnetism and their coupling constant within a unified geometric description in which particles correspond to closed rotor configurations and radiation corresponds to propagating curvature disturbances. These results extend the mathematical formalism of the Rotor Dynamics Framework and suggest that electromagnetic phenomena may be understood as large-scale manifestations of curvature transport within a single underlying rotor manifold.
Dose--Response Relationships Between Nitrous Oxide Exposure, Biomarkers, and Neurolog...
Laxna Bhujel
Angela L. Chiew

and 6 more

March 19, 2026
Aims: Recreational nitrous oxide (N2O) misuse is a growing problem. It can cause functional hydroxycobalamin (B12) deficiency and severe neurological complications. Despite increasing harms, the dose-response relationship and clinical value of biomarkers remain unclear. This study aimed to characterise the demographics, patterns of use, outcomes, and biomarker utility in patients hospitalised with N2O toxicity, and to examine whether a dose-response relationship exists between exposure and toxicity. Methods: This was a retrospective cohort study of hospitalised individuals with N2O toxicity between 1 January 2020 and 31 July 2025 across six Sydney hospitals. Weekly and cumulative N2O exposure volumes were estimated. Exposure volumes were compared with clinical outcomes and biomarker results to assess correlations and biomarker utility. Results: Eighty-one individuals across 100 presentations were included. Higher cumulative volumes of N2O exposure were associated with worse neurological impairment, including subacute combined degeneration of the spinal cord (SCD) (p=0.006) and peripheral neuropathy (PN) (p=0.036), and showed moderate correlation with greater neuropathy severity (Spearman's ρ=0.43, p=0.001). Median homocysteine was 50 μmol/L (IQR: 32-114; normal 5-15) with a median half-life of 14.59 hours. Median methylmalonic acid was 0.68 μmol/L (IQR: 0.18-3.16; normal <0.27). Regarding diagnosis of SCD, B12 and holotranscobalamin had high specificity (87% and 98%) but low sensitivity (22% and 5%), while homocysteine and methylmalonic acid were more sensitive (95% and 93%) but less specific (13% and 45%). Conclusions: Cumulative N2O exposure is the strongest predictor of severe neurological outcomes in a dose-dependent manner. Homocysteine and methylmalonic acid are good screening tests for neurotoxicity but poor confirmatory tests.
Towards a continental-scale biodiversity monitoring network to assess climate-driven...
Francesco Belluardo

and 8 more

March 20, 2026
As impacts of climate change on biodiversity are increasing globally, decision-makers urgently need scientifically robust, long-term monitoring data on biodiversity responses to inform effective conservation strategies. We designed a European-scale monitoring network to assess climate change effects on bat communities, integrating Species Distribution Models under current and future climatic scenarios with Spatial Conservation Planning tools to locate suitable monitoring site locations across the continent. The monitoring design followed a hypothesis-driven framework, defining projected species range shifts and community turnover as measurable indicators within the Essential Biodiversity Variables framework, and applying a statistically robust, standardized approach to assess climate-driven changes. The optimized site selection also incorporated practical implementation aspects, including workforce availability and variable levels of monitoring resources and efforts. In addition, we applied the monitoring network framework to a regional case study from Catalonia (Spain) that illustrates how existing national or subnational programs not originally designed for climate research can be easily adapted to detect climate impacts on regional bat communities, thereby providing a foundation for a future European-scale network built upon ongoing monitoring initiatives. The proposed continental network can support the European Union in meeting international commitments on biodiversity conservation and climate policy, while assisting Member States in fulfilling reporting obligations and implementing key policy instruments such as the Habitats Directive, Biodiversity Strategy for 2030, and Nature Restoration Regulation. 
The proposed framework offers substantial flexibility and scalability across both spatial and taxonomic dimensions and, although developed for bats, it is designed to be broadly transferable across geographic regions and taxonomic groups, providing a generalizable approach for climate-informed biodiversity monitoring.
Invariant Structure in Deterministic Decoder Transformation Pipelines
Trent Slade

March 20, 2026
We present a deterministic framework for discovering, validating, and falsifying structural invariants in decoder transformation pipelines. Rather than optimizing convergence speed or heuristic performance, this work focuses on identifying algebraic and dynamical properties that remain invariant under repeated application of transformation operators. Using a fully deterministic experimental pipeline, we demonstrate that several non-trivial invariants, such as ternary closure, idempotence classes, and dark-state cascade behavior, can be formally proven against the actual implementation. Crucially, we complement positive results with adversarial counterexamples, showing that commonly assumed properties (e.g., universal absorbing states, global idempotence under damping) do not hold in general. This establishes a new methodology for invariant-driven analysis of message-passing systems, shifting emphasis from optimization toward structural understanding and computation elimination.
A Structural Theory of Symbolic Objects
Sergey Kotikov

March 20, 2026
This article develops a structural theory of symbolic objects in which syntactic presentation, semantic interpretation, and admissible transformation are treated within a common formal framework. The central idea is that many mathematical and computational entities admit a representation not only by canonical values, but also by structured symbolic realizations that retain compositional organization and transformation history. To formalize this viewpoint, we introduce symbolic structures, define their semantic quotients, and organize structure-preserving transformations in a category denoted by SEq. The framework is illustrated first in the elementary setting of integer-valued symbolic words and is then extended to graphs, weighted symbolic models, logical systems, and operators on structured state spaces. The resulting theory provides a general language for provenance-aware symbolic mathematics and compositional semantics.
Emergence of Newtonian Gravity and Lepton Mass Hierarchy from String Rewriting System...
Sergey Kotikov

March 20, 2026
We present a computational framework demonstrating that fundamental physical laws can emerge from discrete string rewriting systems defined on graphs with power-law connectivity. By constructing one-dimensional lattices augmented with long-range connections following a probability distribution P(d) ∼ d^(−α), we solve the Poisson equation on the resulting graph Laplacian to derive an effective gravitational potential. Through a systematic grid search over the parameter space (α, N), we identify α = 2.2 as producing a gravitational force law F ∼ r^(−2.003), deviating from Newtonian gravity by merely 0.34%. Furthermore, we model three generations of leptons using string rewriting rules of lengths L = 3, 5, 7, deriving a two-parameter mass formula that reproduces the experimental mass ratios m_μ/m_e = 206.8 and m_τ/m_μ = 16.8 with zero error. The framework suggests a unified geometric origin for both gravitational and particle physics phenomena through a single structural parameter α ≈ 2.
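The graph construction and Poisson solve described above can be sketched as follows; this is a minimal reconstruction from the abstract, and the lattice size, edge count, and solver choice are assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_laplacian(n=200, alpha=2.2, extra=400):
    """1-D ring lattice plus long-range edges drawn with P(d) ∝ d^(−α)."""
    A = np.zeros((n, n))
    for i in range(n):                      # nearest-neighbour ring
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
    dists = np.arange(2, n // 2)
    p = dists.astype(float) ** -alpha
    p /= p.sum()
    for _ in range(extra):                  # long-range shortcuts
        i = int(rng.integers(n))
        d = int(rng.choice(dists, p=p))
        j = (i + d) % n
        A[i, j] = A[j, i] = 1
    return np.diag(A.sum(axis=1)) - A       # graph Laplacian L = D - A

def potential(L, src=0):
    """Solve the Poisson equation L·phi = rho for a zero-mean point source."""
    n = L.shape[0]
    rho = np.full(n, -1.0 / (n - 1))        # uniform sink balances the source
    rho[src] = 1.0
    phi = np.linalg.lstsq(L, rho, rcond=None)[0]  # min-norm solution on singular L
    return phi - phi.mean()
```

A grid search over (α, N) in the paper's spirit would then fit the decay of φ (or of the force −∇φ) against distance from the source to extract the effective exponent.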
Structural and Epistemological Interpretation of High-Redshift Point Sources Recons...
hong seok houn

March 20, 2026
Recent JWST observations have revealed numerous compact, high-luminosity sources at high redshift that often appear unresolved or point-like, yet exhibit luminosities comparable to small or intermediate galaxies. Interpreting these objects as fully formed early galaxies leads to recurring tensions, including extreme stellar mass densities, unrealistically short formation timescales, and unclear evolutionary continuity toward lower redshift populations. This work proposes that such tensions may not require new physics, but instead arise from an epistemological bias toward interpreting astronomical systems as fixed objects rather than transient phases. In particular, present-day globular clusters are observed as old and faint remnants, which may obscure the possibility that their formation phases were short-lived but extremely luminous. We propose that a subset of high-redshift point sources are not single galaxies, but spatially unresolved assemblies of multiple proto–globular clusters (proto–GCs) formed over a finite time spread. In this framework, the observed spectral energy distribution (SED) must exhibit intrinsic age-mixed signatures, reflecting the coexistence of young, intermediate, and evolved stellar populations. This study presents a phase-based reinterpretation of such sources and provides explicit testable predictions and falsifiability criteria to distinguish this model from conventional galaxy interpretations.
Stereospecific Dehydroxytrifluoromethoxylation of Alcohols with 2,4-Dinitro(trifluoro...
You Peng
Haijin Guo

and 4 more

March 19, 2026
To fully elucidate the function of the trifluoromethoxy group in drug discovery and materials science, the efficient preparation of stereodefined analogues is essential. The established dehydroxytrifluoromethoxylation reaction, while synthetically useful, follows an SN1 mechanism and consequently produces racemic products. Herein, we report a stereospecific SN2 dehydroxytrifluoromethoxylation reaction. This protocol employs readily available 2,4-dinitro(trifluoromethoxy)benzene (DNTFB) as a trifluoromethoxide source and fluoro-N,N,N′,N′-tetramethylformamidinium hexafluorophosphate (TFFH) as a dual activator. TFFH activates the alcohol for isouronium formation and provides fluoride to liberate the CF3O− anion from DNTFB. This stereospecific method offers a highly efficient, low-cost, and broadly applicable strategy for the late-stage modification of complex drug molecules.
Dark-State Invariants: A Deterministic Framework for Detecting Computational Redundan...
Trent Slade

March 20, 2026
We introduce Dark-State Invariants, a deterministic framework for identifying temporally stable regions in belief propagation (BP) dynamics. Rather than optimizing update rules or convergence speed directly, this approach measures where iterative message-passing has effectively ceased to produce meaningful changes. A node is classified as dark-stable when both its sign and magnitude remain invariant within a fixed tolerance across consecutive iterations. This yields a per-iteration mask and derived metrics quantifying the fraction of computationally inactive nodes. Dark-state invariants provide a new observable complementary to existing measures such as cosine fidelity and sign agreement. Empirical evaluation across deterministic stress scenarios demonstrates that convergent systems rapidly accumulate high dark-state fractions, while oscillatory and chaotic systems do not. This enables a principled shift from acceleration-based optimization toward invariant-driven computation elimination, where redundant updates can be safely avoided.
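The dark-stable classification rule stated above translates almost directly into code; the function names below are hypothetical, but the criterion (sign and magnitude both invariant within a fixed tolerance across consecutive iterations) follows the abstract.

```python
import numpy as np

def dark_state_mask(prev, curr, tol=1e-6):
    """Boolean mask of nodes whose sign AND magnitude are stable
    between two consecutive message-passing iterations."""
    same_sign = np.sign(prev) == np.sign(curr)
    same_mag = np.abs(np.abs(prev) - np.abs(curr)) <= tol
    return same_sign & same_mag

def dark_fraction(history, tol=1e-6):
    """Per-iteration fraction of dark-stable nodes over a message history
    (a list of per-node message arrays, one per iteration)."""
    return [dark_state_mask(a, b, tol).mean()
            for a, b in zip(history, history[1:])]
```

On a convergent trajectory the fraction climbs toward 1.0 (those nodes' updates are redundant and could be skipped), while a sign-oscillating trajectory keeps the fraction near 0, matching the qualitative behavior the abstract reports.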
Decoupled Space-Time Parallel Solver Integrating Parareal and MATE for Scalable Trans...
Francis C Joseph

and 1 more

March 20, 2026
This paper presents a decoupled space-time parallel solver that integrates the Parareal algorithm (parallel-in-time) with the Multi-Area Thevenin Equivalent (MATE) method (parallel-in-space) for simulating the transient stability of large-scale power systems. A shared-memory implementation of MATE, exploiting both spatial and task-level parallelism, is incorporated into the Master-Worker (PM) and Distributed (PD) Parareal paradigms. The hybrid solver concurrently employs two high-performance computing (HPC) frameworks, OpenMP for MATE and the Message Passing Interface (MPI) for Parareal, to achieve scalability across both domains. Two implementation strategies are examined: homogeneous configurations with equal MATE partitions and heterogeneous configurations with unequal partitions. Simulation results on large systems with detailed generator and composite load models demonstrate that homogeneous scheduling complements Parareal, achieving nearly linear speedup while preserving MATE's expected performance. Consequently, the overall speedup of the hybrid solver approximates the product of the individual MATE and Parareal speedups. Heterogeneous scheduling offers performance benefits when uniform resource allocation is infeasible, allowing for flexible processor deployment across diverse computing environments.
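The product rule for the combined speedup can be made concrete with a small sketch; the idealized Parareal cost model below (k corrector iterations, a coarse-to-fine cost ratio, n time slices) is a textbook simplification, not the paper's measured model.

```python
def parareal_speedup(n_time, k_iters, coarse_cost_ratio=0.05):
    """Idealized Parareal speedup: serial fine cost = 1, parallel cost
    ≈ k * (coarse sweep + one fine slice), ignoring communication."""
    return 1.0 / (k_iters * (coarse_cost_ratio + 1.0 / n_time))

def hybrid_speedup(mate_speedup, parareal):
    """Homogeneous scheduling: the space and time decompositions are
    independent, so the overall speedup ≈ the product of the two."""
    return mate_speedup * parareal
```

For example, a 4x spatial MATE speedup combined with 16 time slices and 2 corrector iterations at a 5% coarse-cost ratio gives roughly an order-of-magnitude overall speedup under this model.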
Non-fungible Tokens (NFTs) in Diagnostic Imaging
Andrew Kim

and 3 more

March 20, 2026
Non-fungible Tokens (NFTs) in Diagnostic Imaging
Andrew Kim1, Jarrett Bobrin1, David Weinstein1, Isabelle G. Kim. Temple University Hospital1, Department of Radiology, Philadelphia, PA.

Non-fungible tokens (NFTs) have garnered significant media attention in recent years, largely due to the astronomical prices fetched by some digital artworks. They have emerged as a popular medium for buying and selling digital art. Most people associate NFTs with high-profile examples such as the Bored Ape Yacht Club or Beeple's digital artwork, the latter of which famously sold for over $69 million. Even the world's first SMS text message was converted into an NFT and sold for over 100,000 euros. In 2021, the NFT market was valued at approximately $41 billion USD, and the term "NFT" ranked among the most popular search terms on Google during both 2021 and early 2022.

However, NFTs are more than just digital collectibles; they hold significant untapped potential, particularly in the medical field, including diagnostic imaging. While blockchain technology has been widely explored in healthcare, the specific role of NFTs in diagnostic imaging remains largely unexplored. Although there has been extensive discussion on the use of blockchain in medicine, the application of NFTs in this space is still in its infancy.

So, what exactly is an NFT? A non-fungible token is a unique digital asset representing ownership of a specific item or piece of data, whether that be digital artwork, music, or, in more recent applications, items in video games or medical records. NFTs are built using the same blockchain technology as cryptocurrencies like Ethereum. However, unlike cryptocurrencies or fiat currencies, NFTs are non-fungible, meaning they are not interchangeable, and each holds a distinct value. Both NFTs and cryptocurrencies rely on blockchain transactions to validate authenticity and ownership.
NFTs serve as a digital certificate of ownership, and each time an NFT changes hands, the transaction is recorded on the blockchain's decentralized, public ledger. NFTs also incorporate smart contract technology, which is particularly relevant to the field of medicine. For instance, in the art world, the original artist may receive a royalty every time their artwork is resold. This same mechanism can be applied to healthcare data, offering both security and potential financial benefits to patients.

In the U.S., it is estimated that each patient generates approximately 80 megabytes of health data annually. Utilizing NFTs to manage medical data would allow patients to enhance the confidentiality of their personal health information. Through smart contracts, patients could control and define who has access to their data, whether it is their primary care physician, an emergency room doctor, a radiologist, or a specialist at a cancer center. Once recorded on a public, decentralized blockchain, this data becomes immutable and highly secure, preventing tampering or unauthorized access.

This model empowers patients and shifts control away from commercial or nonprofit institutions that often manage and monetize patient data without individual input. As Dr. Kristin Kostick-Quenet has pointed out, once health information is digitized, it typically falls out of the patient's control and is commodified by companies for profit. NFTs offer a solution: patients could maintain ownership over their data and even receive financial compensation when it is accessed or utilized.

The digital contracts associated with NFTs also allow patients to trace the use of their data: who accessed it, when, how, and why. According to an article from Cointelegraph, the healthcare platform Aimedis plans to tokenize anonymized patient data into NFTs, which can then be sold to pharmaceutical companies. In return, patients may receive revenue from the sales of these NFT tokens.
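The access-control and royalty mechanism described above can be illustrated with a toy sketch. Real smart contracts execute on-chain (e.g., as Solidity ERC-721 contracts); this Python class, with entirely hypothetical names and fee values, only models the grant/revoke, royalty, and audit-trail logic.

```python
from dataclasses import dataclass, field

@dataclass
class ImagingNFT:
    """Toy model of an NFT-style record for one imaging study.

    Illustrative only: a deployed version would live on a blockchain;
    here the "contract" is an ordinary in-memory object.
    """
    owner: str                      # the patient
    study_id: str
    royalty: float = 5.0            # hypothetical fee per commercial access
    granted: set = field(default_factory=set)
    audit_log: list = field(default_factory=list)

    def grant(self, party: str):
        self.granted.add(party)

    def revoke(self, party: str):
        self.granted.discard(party)

    def access(self, party: str, commercial: bool = False):
        """Record the access attempt; pay the owner a royalty if commercial."""
        allowed = party in self.granted
        payout = self.royalty if (allowed and commercial) else 0.0
        self.audit_log.append((party, allowed, payout))
        return allowed, payout
```

A patient could grant a treating radiologist read access (no fee), deny an unapproved pharmaceutical company, and later grant commercial access that triggers a royalty, with every attempt, allowed or not, appended to the audit trail.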
However, a key challenge remains: healthcare IT systems are currently fragmented and not yet optimized for this level of integration. In an ideal future, patients would use a single login interface to manage all their health data.

Importantly, NFTs can enhance the quality and accessibility of medical data, making it more suitable for artificial intelligence applications and data mining. Aimedis aims to revolutionize the global exchange and monetization of de-identified health data using blockchain and NFT technologies.

NFTs also have direct applications in radiology. Patients could predefine which radiologists or physicians can access their imaging studies and reports. They could also track who views their data and under what circumstances. If their imaging is later sold or used by a commercial entity, such as a medical center or pharmaceutical company, for research or drug development, the patient could receive royalty payments each time it is used. For example, if a cancer patient undergoes a PET/CT scan and the resulting data is converted into an NFT, a pharmaceutical company using that data in drug research might owe compensation to the patient.

Moreover, NFTs could enhance the information available to radiologists. For example, they could include important historical details, such as previous reactions to gadolinium contrast, a history of renal insufficiency, or retained metal that could affect MRI compatibility. Such centralized and accessible data would aid in ensuring patient safety and improving diagnostic accuracy.

With the rise of telemedicine, NFTs could also play a key role in verifying transactions between the physical and digital healthcare environments. For example, a doctor's prescription or imaging order could be tokenized, eliminating any ambiguity regarding its origin or intent.
In radiology, this could clarify whether a referring physician wanted a CT scan with or without contrast, or preferred a two-view chest X-ray over a portable study, ultimately improving communication between referring clinicians and radiology departments.

Teleradiology images could also be tokenized, giving patients visibility over who has accessed their reports and to whom results were sent. In addition, NFTs could be used to verify the credentials of radiologists, such as medical degrees and certifications. Since this information would be recorded on an immutable blockchain, it would be secure and tamper-proof. This could reduce administrative burdens, such as those placed on radiology file rooms by repeated requests for copies of reports or credentials. Tokenized radiology data may also serve as a valuable audit trail, allowing radiologists to confirm that their reports were viewed and used appropriately by referring clinicians.

While numerous challenges remain, including legal considerations, government regulations, and the environmental impact of blockchain technology, NFTs are poised to play a growing role in healthcare. Diagnostic imaging, often at the forefront of technological innovation in medicine, is well positioned to benefit from the adoption of blockchain-based NFT applications.

References:
Conti, R. (2022, August 16). What is an NFT? Non-fungible tokens explained. Forbes. Retrieved August 29, 2022, from https://www.forbes.com/advisor/investing/cryptocurrency/nft-non-fungible-token/
Culbertson, N. (2021, August 6). The skyrocketing volume of healthcare data makes privacy imperative. Forbes. Retrieved August 29, 2022, from https://www.forbes.com/sites/forbestechcouncil/2021/08/06/the-skyrocketing-volume-of-healthcare-data-makes-privacy-imperative/?sh=327ba8536555
Diaz, N. (n.d.). What NFTs need to achieve before healthcare implementation. Becker's Hospital Review.
Retrieved August 29, 2022, from https://www.beckershospitalreview.com/healthcare-information-technology/what-nfts-need-to-achieve-before-healthcare-implementation.htmlHarrison, S. (2022, April 13). Some medical ethicists endorse nfts-here’s why. Scientific American. Retrieved August 29, 2022, from https://www.scientificamerican.com/article/some-medical-ethicists-endorse-nfts-heres-why/HHMGlobal, C. T. (2022, April 11). Content team HHMGlobal. HHM Global B2B Online Platform Magazine. Retrieved August 29, 2022, from https://www.hhmglobal.com/knowledge-bank/news/can-nfts-be-repurposed-for-the-healthcare-industryJones, C. (2021, September 13). Why nfts, crypto and blockchain can help e-health thrive. Cointelegraph. Retrieved August 29, 2022, from https://cointelegraph.com/news/why-nfts-crypto-and-blockchain-can-help-e-health-thriveKhatri, N. (2021, December 8). Beyond Trendy Investments: Three applications of nfts in healthcare and Pharma Marketing. PM360. Retrieved August 29, 2022, from https://www.pm360online.com/beyond-trendy-investments-three-applications-of-nfts-in-healthcare-and-pharma-marketing/Kostick-Quenet, K., Mandl, K. D., Minssen, T., Cohen, I. G., Gasser, U., Kohane, I., &amp; McGuire, A. L. (2022). How nfts could transform Health Information Exchange. Science, 375(6580), 500–502. https://doi.org/10.1126/science.abm2004Limited, V. M. P. (n.d.). AIMEDIS announces the NFT Healthcare Platform. Newsfile. Retrieved August 29, 2022, from https://www.newsfilecorp.com/release/103552/Aimedis-Announces-the-NFT-Healthcare-PlatformMcGuire, A. (2022, February 4). Can NFT technology benefit healthcare? in. Retrieved August 29, 2022, from https://healthcare-in-europe.com/en/news/can-nft-technology-benefit-healthcare.htmlShyam Sabat MD, M. B. A. (2021, April 27). Blockchain - promises for academic radiology. LinkedIn. 
Retrieved August 29, 2022, from https://www.linkedin.com/pulse/blockchain-promises-academic-radiology-shyam-sabat-md-sabat-mdTagliafico AS, Campi C, Bianca B, et al. Blockchain in radiology research and clinical practice: current trends and future directions. La Radiologia Medica. 2022 Apr;127(4):391-397.YouTube. (2021, September 20). How nfts will revolutionize medicine. YouTube. Retrieved August 29, 2022, from https://www.youtube.com/watch?v=TnhmUltTGo
Improved Post-Processing Framework for Whole Slide Imaging and 3D Reconstruction  
Emily Lizeth De Leon Alvarado

and 8 more

March 24, 2026
Whole slide imaging (WSI) enables high-resolution digitization of histological sections, but 3D reconstruction from serial slides remains hindered by illumination artifacts, tile inconsistencies, and section-specific geometric distortions. We present an enhanced reconstruction framework that addresses both tile-level intensity inhomogeneity and instability in deformable registration. First, we introduce an overlap-aware illumination correction that refines an initial flat-field estimate by adding an intensity equality constraint in overlapping tile regions. Second, we normalize section intensities before reconstruction to ensure consistent intensity profiles among serial sections. Lastly, we incorporate level-set method (LSM) constraints into an existing volumetric reconstruction framework to stabilize alignment. The framework was developed for marmoset brain histology, including within-specimen blockface photography and magnetic resonance imaging (MRI). The LSM reconstruction was validated using a synthetic banana image with a known initial shape, and with mouse histology reconstructed using an external 3D MRI reference. Overlap-aware correction in the marmoset WSI significantly reduced residual gridding artifacts across 112 sections (Friedman χ²(2)=890.04, p<2.2×10⁻¹⁶). Within the marmoset dataset, intensity normalization reduced inter-slice intensity variability (coefficient of variation: 0.392 to 0.246), and incorporation of LSM consistently improved registration accuracy by approximately 10% in Normalized Mutual Information scores and 3-5% in Structural Similarity Index Measure. LSM also yielded improved structural correspondence in both gray matter and white matter structures.
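The intensity equality constraint in overlapping tile regions can be illustrated in a heavily simplified form. The abstract does not give the paper's actual formulation, so the sketch below solves only the one-parameter case: a multiplicative gain on one tile chosen by closed-form least squares so that its overlap pixels match the neighboring tile.

```python
def overlap_gain(tile_a, tile_b):
    """Least-squares gain g minimizing sum((g*a - b)^2) over overlap pixels.

    Simplified stand-in for an overlap intensity-equality constraint:
    tile_a and tile_b are flat lists of intensities sampled from the
    same physical region imaged by two adjacent tiles.
    """
    num = sum(a * b for a, b in zip(tile_a, tile_b))
    den = sum(a * a for a in tile_a)
    return num / den if den else 1.0

# Tile A reads systematically dimmer than tile B in their shared strip.
a = [80.0, 100.0, 120.0]
b = [100.0, 125.0, 150.0]
g = overlap_gain(a, b)
corrected = [g * x for x in a]
print(g)          # 1.25
print(corrected)  # [100.0, 125.0, 150.0]
```

A full flat-field refinement would solve a joint system over all tiles and a smooth illumination surface; the closed-form single-gain solution above is just the smallest instance of that idea.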
PARERGON-137: The Illusion of Empirical Precision A Methodological Reductio ad Absurd...
Heiko Grimberg

March 20, 2026
This document serves as a forensic demonstration. It documents the construction of a mathematical identity that replicates the CODATA value of the fine-structure constant α⁻¹ with a precision of eleven significant digits. We hereby formally declare that the following derivation is physically meaningless "curve-fitting". Its sole purpose is to prove that numerical accuracy within the framework of prevailing parameter-physics is not an indicator of truth, but merely a measure of mathematical cynicism.
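The thesis that many-digit agreement is cheap can be illustrated with a two-line stdlib demonstration: the best rational approximation with a bounded denominator reproduces the CODATA value α⁻¹ ≈ 137.035999084 far beyond experimental uncertainty while carrying no physical content whatsoever. The denominator bound of 10⁹ is arbitrary; this is an illustration of the general point, not the paper's specific identity.

```python
from fractions import Fraction

ALPHA_INV = 137.035999084  # CODATA 2018 recommended value of 1/alpha

# Best rational approximation with denominator <= 10^9: a "formula"
# with two meaningless integer parameters that nails the constant.
approx = Fraction(ALPHA_INV).limit_denominator(10**9)
error = abs(float(approx) - ALPHA_INV)
print(approx, error)  # error is orders of magnitude below any measurement
```

Since fractions with denominator 10⁹ are spaced 10⁻⁹ apart, the error is guaranteed below 10⁻⁹ regardless of the target, which is precisely the point: the fit says nothing about physics.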
Tuning Cation Disorder in LiNi₀.₅Mn₁.₅O₄ via Room Temperature Continuous Flow Co-prec...
Bangxun Yin
Jack Quayle

and 5 more

March 18, 2026
A room temperature continuous co-precipitation process incorporating in-line dynamic mixing was employed for the synthesis of a precursor (production rates up to 0.42 kg·h⁻¹) to the high-voltage lithium-ion cathode material LiNi₀.₅Mn₁.₅O₄ (LNMO). The co-precipitate was initially lithiated in a tube furnace in air via a two-step heat treatment (500 °C/5 h and 850 °C/12 h). After cooling, a third and final heat treatment at either 650, 700, or 750 °C was used to tune the degree of cation disorder in the final LNMO sample. X-ray Photoelectron Spectroscopy revealed a clear temperature-dependent trend for the surface of the final LNMO sample heat-treated at 650 °C (surface disorder ca. 38%), whilst those heat-treated at 700 and 750 °C showed ca. 46 and 53% surface disorder, respectively. Electrochemical testing of the three final LNMO samples demonstrated the latter material (final 750 °C sample) delivered discharge capacities of 124, 116 and 81 mAh·g⁻¹ at 0.1, 1, and 5C, respectively. Furthermore, incorporation of 2.5 wt % multi-walled carbon nanotubes (MWCNTs) into the electrode formulation significantly enhanced the rate capability, yielding 98 mAh·g⁻¹ at 5 C and 91 mAh·g⁻¹ at 10 C, together with improved cycling stability (up to 98.4 % retention after 100 cycles at 1 C). Overall, the use of a flow process for precursor synthesis provided a scalable and energy-efficient route to spinel-phase LNMO, offering both high performance and tuneable surface cation disorder.
ACR-Drone: Adaptive Contextual Reasoning for Enhanced Self-Iterative Drone Autonomy
Bowen Mu

and 1 more

March 20, 2026
Achieving fully autonomous embodied agents, particularly drones, in complex and dynamic real-world environments remains a significant challenge. While Large Language Models (LLMs) have advanced decision-making, existing systems struggle with extreme dynamic changes, instruction ambiguities, or complex failure modes requiring deeper scene understanding. Specifically, the efficiency of self-correction systems relies on intelligently filtering and distilling contextual information for LLMs to provide precise diagnostic evidence. This paper introduces ACR-Drone, an embodied drone autonomous task planning and self-iterative improvement system. ACR-Drone significantly enhances environmental comprehension and self-correction through an Adaptive Contextual Reasoning (ACR) mechanism. It integrates novel components such as dynamic semantic anchor points for initial Behavior Tree (BT) generation, an Adaptive Contextual Monitoring and Execution module featuring Context-Aware Filtering and Predictive State Reasoning, an Embodied Knowledge Base, and Hierarchical BT Refinement with Contextual Constraints for targeted modifications. Rigorous experimental validation, adhering to established task benchmarks, demonstrates that ACR-Drone consistently achieves superior overall task success rates, improved failure diagnosis, enhanced refinement efficiency, and greater robustness in both simulated and real-world environments, without requiring fine-tuning of the underlying LLMs. Our system's proactive detection capabilities and reduced refinement cycle times underscore the profound benefits of adaptive contextual reasoning for advanced drone autonomy.
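ACR-Drone's actual Behavior Tree machinery is not public, but the idea that execution should surface "precise diagnostic evidence" for the LLM can be sketched with a minimal BT whose sequence node records which child failed. All node names and the context fields below are hypothetical.

```python
SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class Action:
    """Leaf node: wraps a predicate over the shared execution context."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self, ctx):
        return SUCCESS if self.fn(ctx) else FAILURE

class Sequence:
    """Runs children in order; fails fast and records which child failed,
    giving a refinement loop a concrete node to target."""
    def __init__(self, name, children):
        self.name, self.children = name, children

    def tick(self, ctx):
        for child in self.children:
            if child.tick(ctx) == FAILURE:
                ctx["failed_node"] = child.name  # diagnostic evidence
                return FAILURE
        return SUCCESS

tree = Sequence("inspect_roof", [
    Action("takeoff", lambda c: c["battery"] > 0.2),
    Action("navigate", lambda c: c["gps_lock"]),
    Action("capture", lambda c: True),
])

ctx = {"battery": 0.9, "gps_lock": False}
print(tree.tick(ctx), ctx["failed_node"])  # FAILURE navigate
```

In a self-iterative system the `failed_node` label, together with filtered context, is what a refinement step would feed back to the planner, rather than the raw telemetry stream.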
Deterministic Redundancy Elimination via Trace-Indexed Sign/CRC Reuse in QSOL-IMC QEC...
Trent Slade

March 18, 2026
This document presents a formally validated optimization within the QSOL-IMC Quantum Error Correction (QEC) framework (v68.5.0). The optimization exploits a data-level invariant in belief propagation (BP) dynamics diagnostics: sign vectors and their derived CRC32 signatures are pure, deterministic functions indexed solely by trace position. This invariant enables elimination of redundant computation across multiple diagnostic metrics through trace-level precomputation and reuse. The invariant is proven via slice-level equivalence, byte-level identity, and purity arguments, and validated with deterministic instrumentation. The full test suite remains unchanged and passes completely. The optimization reduces redundant sign computations by 75.3% within BP dynamics analysis while preserving bitwise identity, convergence behavior, and reproducibility guarantees. Scope. This document covers invariant validation and diagnostic-layer optimization only. No algorithmic changes to decoding behavior are introduced.
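The stated invariant, that sign vectors and their CRC32 signatures are pure functions of trace position, is exactly the precondition for memoization. The sketch below shows the reuse pattern with a stand-in `sign_vector`; the real QSOL-IMC sign extraction and trace format are not reproduced here.

```python
import zlib
from functools import lru_cache

def sign_vector(trace_idx: int) -> bytes:
    """Stand-in for the BP diagnostic: a pure, deterministic function
    of trace position (the real QSOL-IMC extraction is not shown)."""
    return bytes((trace_idx * 31 + i) % 2 for i in range(16))

@lru_cache(maxsize=None)
def sign_crc(trace_idx: int) -> int:
    """CRC32 of the sign vector: computed once per trace position,
    then reused bitwise-identically by every diagnostic metric."""
    return zlib.crc32(sign_vector(trace_idx))

# Multiple diagnostic metrics query the same positions: only the first
# call per index computes; repeats are cache lookups.
crcs = [sign_crc(i % 4) for i in range(16)]  # 16 queries, 4 computations
info = sign_crc.cache_info()
print(info.hits, info.misses)  # 12 4
```

Because the cached value is the very integer the first computation produced, reuse preserves bitwise identity by construction, which is the property the document's validation checks.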
Fractal Phase-Space Uncertainty: A Logical-Spectral Reformulation of the Heisenberg P...
Enrique Vidal Silvente

March 18, 2026
We develop a fractal-spectral reformulation of the Heisenberg uncertainty principle within a nonlinear phase-space framework motivated by the Quantum Fractal-Logical Unified Field architecture. Instead of working on the standard Hilbert space L²(ℝ) with Lebesgue measure and linear position/momentum operators, we consider a fractal Hilbert space H_F = L²(F, μ_H) built on a Cantor-type metric-measure substrate with Hausdorff measure μ_H and a self-adjoint fractal Hamiltonian. Within this setting we define effective position and momentum observables (X_F, P_F) adapted to the fractal geometry and derive a generalized uncertainty relation
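The abstract is cut off before the relation itself. For orientation only: any pair of self-adjoint observables on a Hilbert space, including (X_F, P_F) on H_F, must at minimum satisfy the standard Robertson bound; the paper's fractal-specific generalization is not reproduced here.

```latex
% Robertson bound, valid for any self-adjoint pair on a Hilbert space;
% the paper's fractal-geometric refinement is not reproduced here.
\Delta X_F \,\Delta P_F \;\geq\; \tfrac{1}{2}\,
\bigl|\langle \psi \,|\, [X_F, P_F] \,|\, \psi \rangle\bigr|,
\qquad \psi \in \mathcal{H}_F = L^2(F, \mu_H).
```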
Identifying Cross-Timescale Coupling in Bering Sea Air-Sea Interactions Using Bispect...
Emily E. Hayden

and 3 more

March 18, 2026
A document by Emily E. Hayden.
Coupled fate of true polar wander, habitability, and planetary climate dynamics: a ge...
Mostafa Kiani Shahvandi

March 18, 2026
True Polar Wander (TPW) describes the reorientation of a planet's rotational axis relative to its surface and arises from the conservation of angular momentum between the planetary components, including mantle, core, and climate system (atmosphere, ocean, cryosphere, and terrestrial hydrology). TPW has long been hypothesized to influence planetary climates and, consequently, to play a critical role in assessments of planetary habitability. Recently, several studies have developed simplified frameworks to examine the climatic and habitability implications of TPW. Owing to their schematic nature and their primary focus on Earth, these approaches may be inadequate for capturing the full range of TPW dynamics, particularly for planets with more complex geometries and physical characteristics. Here we present a general theoretical framework for the coupled TPW-climate system of an arbitrary planet, incorporating solid and fluid cores, mantle, and surface climate dynamics. We present a few simple examples to show that TPW can lead to modification of planetary snowball states, substantially increasing the global mean temperature in regions confined to certain, obliquity-dependent latitudes, and thus enabling the survival of life. Furthermore, we demonstrate that TPW can affect the initiation of snowballs by modifying the planetary radiative forcing. This theory provides a unified basis for investigating the long-term climate evolution and habitability of planets undergoing TPW and offers a foundation for identifying promising observational targets in the search for life.

