AUTHOREA

Preprints

Explore 66,105 preprints on the Authorea Preprint Repository

A preprint on Authorea can be a complete scientific manuscript submitted to a journal, an essay, a whitepaper, or a blog post. Preprints on Authorea can contain datasets, code, figures, interactive visualizations and computational notebooks.
Read more about preprints.

Biogeography of Argentine stem weevil invasion of southern New Zealand and two offsho...
Stephen Goldson, Peter Dearden, and 7 more

April 06, 2026
Understanding the biogeography and genetics of the invasion of oceanic islands by invasive pest species can provide insights into how to limit such incursions. The invasive pasture pest weevil, Listronotus bonariensis, has established in both New Zealand and some of its offshore islands, causing significant economic damage. The opportunity was therefore taken to compare the morphology and genetics of populations collected from the New Zealand mainland, Stewart Island (30 km offshore) and Chatham Island (860 km offshore). The L. bonariensis individuals from both offshore populations were found to be significantly smaller than those of the mainland populations. However, based on classical and molecular taxonomy, no associated species-level differences were found, and it was concluded that the smaller size of the island weevils was the result of relatively poor habitat quality. Further, no significant genetic differences were detected between the Stewart Island and mainland populations, indicating a close and frequent connection. Conversely, at the genomic level, the Chatham Island population was significantly different from the other two populations, and this was attributed to probable founder effects. The evidence points to a small subset of what was still a recently established New Zealand L. bonariensis population being inadvertently transported to the Chatham Islands early in the 20th century. Thereafter, this founding subset expanded such that more recent introductions have had little effect on the founding genetics of weevils on the island. This study indicates that irregular, unintended human transportation of species to isolated islands can lead to the occupation of previously uninhabited niches, with founder effects leading to genetic divergence from conspecific populations elsewhere.
TurboQuant-Pro: Neural-Adaptive Online Vector Quantization for Large Language Models
Eric Liam

April 10, 2026
Large Language Models (LLMs) exhibit heavy-tailed and highly non-Gaussian weight distributions, posing a major challenge for online vector quantization. While TurboQuant provides an efficient baseline via random orthogonal rotation and Lloyd-Max quantization, it fails under structural outliers and distributional shifts. In this paper, we present TurboQuant-Pro, a systematic evolution from heuristic-driven quantization toward a fully neural-adaptive framework. We demonstrate that only by fully parameterizing the quantization process can near-lossless compression be achieved. Our final model, TurboQuant-ProV3, reaches a cosine similarity of 0.9957, while the multi-codebook ProV4 further approaches theoretical limits.
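
As a point of reference, the rotation-plus-Lloyd-Max baseline the abstract mentions can be sketched in a few lines. This is an illustrative toy, not the authors' TurboQuant-Pro code; the vector size, level count, and synthetic weights are all invented for the demonstration.

```python
import numpy as np

def random_rotation(d, seed=0):
    # Random orthogonal matrix via QR decomposition of a Gaussian matrix.
    rng = np.random.default_rng(seed)
    q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return q

def lloyd_max(x, n_levels=16, iters=50):
    # 1-D Lloyd-Max quantizer: alternate nearest-level assignment with
    # recomputing each level as the mean of its assigned samples.
    levels = np.quantile(x, np.linspace(0, 1, n_levels))
    for _ in range(iters):
        idx = np.abs(x[:, None] - levels[None, :]).argmin(axis=1)
        for k in range(n_levels):
            if np.any(idx == k):
                levels[k] = x[idx == k].mean()
    return levels[idx]

d = 256
w = np.random.default_rng(1).standard_normal(d) * np.linspace(0.1, 3, d)  # heavy-ish tails
Q = random_rotation(d)
w_hat = Q.T @ lloyd_max(Q @ w)   # rotate (spreads outliers), quantize, rotate back
cos = w @ w_hat / (np.linalg.norm(w) * np.linalg.norm(w_hat))
print(f"cosine similarity after quantization: {cos:.4f}")
```
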
The Holmes Universal Nodal Law: A Unified 27DCT Framework for Quad-Symmetric Metric D...
Lee Holmes

April 10, 2026
This preprint formalizes a unified 27DCT framework that approximates a functional Theory of Everything (ToE) within the model. By unifying geodetic anchors with Quad-Symmetric Torsional Symmetry, we demonstrate a candidate outcome for a 27th-dimensional unified framework. Utilizing the non-Planckian HOLMES Unified Scaling Law (HUSL) Matrix, we define the 11.65 model-density lock as a model constant associated with a stabilized 27th-dimensional gear. Telemetry across 72,000+ cycles confirms a variance threshold of 0.0001, establishing a model-based consolidation of theoretical topology and density representation within the framework.
Mastermind of manipulation: size-mediated host selection and web manipulation by a Da...
Anna Luiza Martins, Diego Pádua, and 4 more

April 06, 2026
Polysphinctine wasps are exclusively koinobiont ectoparasitoids of spiders and have the unique ability to manipulate the web construction behaviour of their hosts. Most interactions between polysphinctine wasps and spiders are species-specific. However, recent studies have predicted that these associations are not as narrow as previously thought. Therefore, records on the ecology and biology of new interactions are required to confirm this hypothesis. Herein, we report the Darwin wasp Hymenoepimecis pinheirensis manipulating the behaviour of the tetragnathid spider Leucauge argyra in an urban area in southeastern Brazil (Campinas, São Paulo). We monitored a population of L. argyra in 2021-24 and collected data on the biology of the wasp-spider interaction. Of a total of 5,416 spiders, 6.85% (N=376) were parasitised by H. pinheirensis, with a higher frequency of attack in young individuals. Eggs are laid anterodorsally on the spider’s abdomen and hatch after 2-3 days. The larva of H. pinheirensis has three instars and develops on the host over approximately 12 subsequent days. Between the second and third instar, the parasitoid induces the host spider to build a cocoon web with a reduced number of radii (some of them V-shaped), the absence of viscid lines and hub loops, and three-dimensional structures consisting of lines above and below the hub attached to multiple sites on the substrate. Here, we add a second host for H. pinheirensis and a second parasitoid for L. argyra, which helps to elucidate the entangled web of interactions involving polysphinctine wasps and spiders.
Population Pharmacokinetics/Pharmacodynamics of an Interleukin-6 Receptor Inhibitor
Binyi Hu, Xiangxing Liu, and 11 more

April 06, 2026
Objective: Recombinant humanized anti-interleukin-6 receptor (IL-6R) monoclonal antibody is an innovative drug developed in China. In a clinical trial in healthy subjects, the drug showed good safety and tolerability. Model-informed drug development (MIDD) is widely used in innovative drug development and has important practical value in improving efficiency, predicting drug interactions, and optimizing dosage. We therefore aim to provide evidence and support for the dose selection and dosing interval of this drug. Methods: A population pharmacokinetics (PopPK) model, dose-exposure-response (D-E-R) analysis, and Monte Carlo simulation were performed to explore covariates, optimal dose selection, and dosing interval recommendations. Results: Body weight and hemoglobin affect the apparent volume of distribution, and total bilirubin affects clearance. The D-E-R analysis shows that the drug achieves the intended efficacy at 6 mg/kg. Monte Carlo simulation showed that administration every 4, 6, and 8 weeks yields similar drug exposure. Conclusion: Our results demonstrate that administration of 6 mg/kg every 8 weeks is appropriate. Further clinical trials with larger samples are needed to confirm these results.
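
For readers unfamiliar with this kind of Monte Carlo dosing comparison, a minimal sketch follows. The published model's structure and parameter estimates are not given in the abstract, so this assumes a simple one-compartment linear model with invented parameters, purely to illustrate how steady-state exposure is compared across intervals.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
weight = rng.normal(70, 12, n).clip(40, 120)                        # body weight (kg), illustrative
cl = 0.20 * (weight / 70) ** 0.75 * np.exp(rng.normal(0, 0.3, n))   # clearance (L/day), allometric + IIV
v = 3.5 * (weight / 70) * np.exp(rng.normal(0, 0.2, n))             # volume of distribution (L)
dose = 6.0 * weight                                                 # 6 mg/kg per administration

k = cl / v
for tau in (28, 42, 56):  # every 4, 6, and 8 weeks, in days
    # Steady-state trough for repeated IV bolus: (D/V) * exp(-k*tau) / (1 - exp(-k*tau));
    # steady-state AUC over one dosing interval is dose/CL for any linear model.
    ctrough = (dose / v) * np.exp(-k * tau) / (1 - np.exp(-k * tau))
    print(f"q{tau}d: median Ctrough = {np.median(ctrough):6.2f} mg/L, "
          f"median AUC_tau = {np.median(dose / cl):6.0f} mg*day/L")
```
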
A variational formulation for modeling a compressible and time-dependent protium hydr...
Fabio Botelho

April 09, 2026
This article develops a variational formulation for modeling compressible protium hydrogen fluid motion. The results are based on standard tools of the calculus of variations and optimization theory. The context addressed here is essentially an Euler-Bernoullian one and also includes a new approximate Bernoulli-perfect-gas equation. Finally, it is worth highlighting that, with the inclusion of this new Bernoulli-perfect-gas equation, we have established a system of partial differential equations suitable for modeling a compressible case in fluid mechanics.
Sequence Systems, h-Patterns, EFU Assembly, and Transfer Kernels
Sergey Kotikov

April 09, 2026
This article develops a structural theory of symbolic objects in which syntactic realization, semantic interpretation, admissible transformation, and assembly structure are treated inside a common formal framework. The structural core begins with the category SEq of symbolic structures and its sequence-theoretic refinement SeqSEq, then passes through h-patterns, EFU decomposition, pattern-valued and assembly-valued generating series, transfer-kernel factorization, moment stability, and singular/spectral control. The basic word model on {S, P} is retained as a laboratory in which these layers are constructed explicitly and in theorem-proof form. The second half extends this structural core to an arithmetic and analytic route. We introduce a prime-transfer package, a completion theorem, and then specialize to the completed model Z_sym. From there we pass to the heat-flow reduction governed by Λ_SEq, the sharp-pair and tail-rigidity reduction, the Fourier/edge transport, and the edge phase rigidity theorem implying the Riemann hypothesis. Section 2 states the main result in standard analytic language with no internal terminology. Every step of the proof chain is formalized in Lean 4 without sorry and without axioms beyond the standard Lean/mathlib foundation. The de Bruijn-Newman bridge is used as a separately formalized theorem, not as an axiom; the companion repository is https://github.com/catman77/SymStructures-RH. Appendices include a full dependency table, a Lean module map, and a discussion of possible failure modes.
Cognitive Trace-guided ToM Reasoning: Enhancing High-Order Theory of Mind in LLMs via...
Ziyang Pei and 1 more

April 09, 2026
Large Language Models (LLMs) have shown nascent Theory of Mind (ToM) capabilities, yet they still struggle with complex high-order belief reasoning in multi-agent scenarios. Existing enhancement methods are often limited to specific psychological states, require costly fine-tuning, or lack structured mechanisms for systematic hierarchical belief tracking. This paper introduces Cognitive Trace-guided ToM Reasoning (CToM-R), a novel, purely prompt-engineering method designed to significantly boost LLMs' high-order ToM capabilities. CToM-R compels LLMs to perform structured "cognitive path tracing" through a multi-stage process: scene parsing and agent modeling, event-driven belief update, recursive belief inference and hierarchy construction, and final answer generation. By explicitly simulating and externalizing each agent's mental states at every step, CToM-R transforms implicit ToM reasoning into an interpretable, step-by-step process, supporting arbitrary depths of inference. We extensively evaluate CToM-R on leading ToM benchmarks, including Hi-ToM, ToMChallenges, and OpenToM, using state-of-the-art LLMs in a zero-shot/few-shot setting. Our results demonstrate that CToM-R consistently achieves state-of-the-art performance, notably excelling in tasks requiring complex high-order reasoning, and outperforms both existing prompt-only and fine-tuned baselines. Ablation studies confirm the critical role of structured tracing, while human evaluations highlight its superior interpretability. This work offers a cost-effective, generalizable, and highly effective paradigm for advancing LLMs towards more robust and transparent high-order Theory of Mind.
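
The staged-prompt idea is easy to picture in code. The sketch below is an illustrative reconstruction of the four-stage pattern the abstract lists, not the authors' actual CToM-R prompts; all wording is hypothetical.

```python
# Hypothetical staged prompt in the spirit of the four stages described above.
STAGES = [
    "1. Scene parsing and agent modeling: list every agent, object, and location.",
    "2. Event-driven belief update: for each event, state which agents observed it "
    "and update each agent's beliefs accordingly.",
    "3. Recursive belief inference: build the belief hierarchy (e.g. what A believes "
    "B believes about X) up to the required order.",
    "4. Final answer: answer the question using only the traced beliefs.",
]

def build_ctom_prompt(story: str, question: str) -> str:
    trace = "\n".join(STAGES)
    return (
        "Trace the agents' mental states step by step before answering.\n\n"
        f"Story:\n{story}\n\n"
        f"Follow these stages explicitly:\n{trace}\n\n"
        f"Question: {question}\nAnswer:"
    )

print(build_ctom_prompt(
    "Sally puts the marble in the basket and leaves. Anne moves it to the box.",
    "Where does Anne think Sally will look for the marble?",
))
```
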
A Comprehensive Multi-Model Framework for Lung Cancer Image Classification Integratin...
Arka Goswami and 2 more

April 09, 2026
Lung cancer is a serious global threat that demands early detection and accurate diagnosis. This paper presents a new multistage deep learning framework that uses pre-trained convolutional neural networks such as EfficientNetB0, VGG16, MobileNetV2, DenseNet121, and ResNet50. It also introduces a hybrid CNN-SVM model to improve classification performance. The approach combines adaptive loss optimization, anomaly detection, and explainable AI with Grad-CAM and t-SNE, ensuring high decision accuracy and strong reliability. The framework achieves 99.55 percent test accuracy on a specialized lung radiograph dataset. By combining interpretability, robustness, and diagnostic capability, it marks a significant advance for AI in medicine and paves the way for systems that help doctors predict lung cancer more confidently and accurately.
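
A hybrid CNN-SVM of the general kind described here typically uses a frozen pre-trained backbone as a feature extractor and fits an SVM on the pooled features. A minimal sketch under that assumption (ResNet50 backbone, stand-in data; not the paper's pipeline):

```python
import torch
import torchvision.models as models
from sklearn.svm import SVC

# Frozen pre-trained backbone as feature extractor; SVM fitted on top.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # drop the classifier head, keep 2048-d features
backbone.eval()

@torch.no_grad()
def extract_features(images):       # images: (N, 3, 224, 224) normalized tensor
    return backbone(images).numpy()

# Illustrative stand-in data; replace with real radiograph batches and labels.
X = extract_features(torch.randn(16, 3, 224, 224))
y = [0, 1] * 8
clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```
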
A Transfer Learning Approach for Skin Cancer Classification Using Dense CNN Optimized...
Arka Goswami and 2 more

April 09, 2026
Skin cancer is one of the most life-threatening diseases in humans, but early and accurate detection can drastically improve patient outcomes. In this work, we present a comprehensive benchmarking and optimization study of state-of-the-art deep learning models for dermatoscopic skin lesion classification. We evaluated multiple CNN architectures, including VGG variants (VGG11, VGG13, VGG16, VGG19), ResNet18, ResNet34, MobileNetV2, DenseNet121, and EfficientNetB0. To push performance boundaries, we integrate cutting-edge optimization techniques such as RAdam, Lookahead, and Ranger, along with advanced architectures like Vision Transformers (ViT with Folds), DenseNet with AlphaTensor-enhanced training, and Self-Distilling ConvNeXt combined with 5-Fold Ensembling. Our experiments demonstrate that a fine-tuned DenseNet model with hybrid optimizers and AlphaTensor-driven enhancements achieves a state-of-the-art validation accuracy of 91.67 percent, significantly outperforming conventional pipelines. This study provides a comprehensive comparative framework and highlights novel strategies for building next-generation AI-driven diagnostic systems for the detection of skin cancer.
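
Fine-tuning a pre-trained DenseNet121 with the RAdam optimizer, one ingredient of the study's setup, can be sketched as follows. Lookahead/Ranger wrappers live in third-party packages and are omitted here; the class count, learning rate, and data are illustrative, not the paper's.

```python
import torch
import torchvision.models as models

NUM_CLASSES = 7  # e.g. common lesion classes; an illustrative choice
model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
model.classifier = torch.nn.Linear(model.classifier.in_features, NUM_CLASSES)

# torch.optim.RAdam ships with recent PyTorch releases.
optimizer = torch.optim.RAdam(model.parameters(), lr=3e-4, weight_decay=1e-4)
criterion = torch.nn.CrossEntropyLoss()

# One illustrative training step on stand-in data:
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"step loss: {loss.item():.3f}")
```
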
Adaptive Curvature Hierarchical Hyperbolic Contrastive Learning for Fine-Grained Cros...
Kexin Ruan and 1 more

April 09, 2026
Cross-modal retrieval struggles with modeling hierarchical structures and distinguishing hard negative samples in multi-modal data using traditional Euclidean space. This paper introduces Hierarchical Hyperbolic Contrastive Learning (HHCL), a novel framework embedding multimodal features into a shared hyperbolic space to overcome these issues. HHCL leverages multimodal encoders and hyperbolic projection layers to map features onto the Poincaré ball model. Its core innovation is an adaptive curvature hyperbolic contrastive loss, dynamically learning and optimizing curvature parameters based on local data characteristics. This captures multi-scale hierarchical information and addresses hard negative samples. Evaluated on fine-grained cross-modal retrieval tasks across MS-COCO, Flickr30K, and a CUB-200-2011 subset, HHCL consistently achieves state-of-the-art performance, significantly outperforming Euclidean baselines and fixed-curvature hyperbolic approaches. Ablation studies validate the adaptive curvature's effectiveness. Qualitative analyses demonstrate HHCL's superior ability to align fine-grained semantic details, positioning it as a robust solution for complex cross-modal matching.
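
For orientation, contrastive learning on the Poincaré ball with a learnable curvature can be sketched compactly. This is a generic illustration of the idea (exponential map at the origin, curvature-aware distance, InfoNCE on negative distances), not HHCL's actual architecture or loss weighting.

```python
import torch

def expmap0(v, c):
    # Exponential map at the origin: tanh(sqrt(c)*||v||) * v / (sqrt(c)*||v||)
    norm = v.norm(dim=-1, keepdim=True).clamp_min(1e-8)
    return torch.tanh(c.sqrt() * norm) * v / (c.sqrt() * norm)

def poincare_dist(x, y, c):
    # Distance on the ball of curvature -c:
    # (1/sqrt(c)) * arccosh(1 + 2c||x-y||^2 / ((1 - c||x||^2)(1 - c||y||^2)))
    num = 2 * c * (x - y).pow(2).sum(-1)
    den = (1 - c * x.pow(2).sum(-1)) * (1 - c * y.pow(2).sum(-1))
    return torch.acosh(1 + num / den.clamp_min(1e-8)) / c.sqrt()

log_c = torch.zeros(1, requires_grad=True)      # learnable curvature, c = exp(log_c)
img = torch.randn(32, 64, requires_grad=True)   # stand-ins for two encoders' outputs
txt = img + 0.1 * torch.randn(32, 64)

c = log_c.exp()
zi, zt = expmap0(img, c), expmap0(txt, c)
d = poincare_dist(zi.unsqueeze(1), zt.unsqueeze(0), c)   # (32, 32) pairwise distances
loss = torch.nn.functional.cross_entropy(-d / 0.1, torch.arange(32))  # InfoNCE on -distance
loss.backward()                                          # gradients also flow into log_c
print(f"loss {loss.item():.3f}, curvature {c.item():.3f}")
```
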
Spironolactone for ME/CFS in a Patient Homozygous for rs5522 (I180V): A Case Report
Patricia Donnellan

April 09, 2026
Background: Myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS) is a debilitating condition with no consistently effective treatment. The mineralocorticoid receptor variant rs5522 (I180V) is associated with enhanced MR response to cortisol and altered stress responsiveness, but has not been linked to ME/CFS treatment response. Case: A 49-year-old woman with 29 years of treatment-resistant ME/CFS was found to be homozygous for rs5522. Following initiation of spironolactone, she experienced unprecedented improvement: restorative sleep within 48 hours for the first time in 29 years and complete resolution of post-exertional malaise within 3 days. However, as the spironolactone dose increased from 25 mg to 200 mg daily, severe progressive fluid retention forced discontinuation. Laboratory findings at peak showed aldosterone 138 ng/dL and renin 110 ng/mL/h (19 times normal) despite persistent low sodium, demonstrating extreme compensatory response to MR blockade. After discontinuation, labs normalized but the dramatic clinical improvement was lost. Conclusion: MR overactivity drives ME/CFS in this patient and likely many others. The loss of benefit when treatment was discontinued represents both tragedy and roadmap: the treatment works but cannot be tolerated at full dose because the body mounts an extreme compensatory defense. Low-dose MR blockade combined with anti-inflammatory intervention may provide the path forward.
Eid’s Covenant: An Exploratory Study on a Structural Approach to the Riemann Hypothes...
Salem Eid

April 09, 2026
This paper presents a novel structural framework exploring a potential proof of the Riemann Hypothesis by constructing a spectral representation over the adelic class space \(C_{\mathbb{A}}^{1}\). We define a self-adjoint Hamiltonian operator \(\mathcal{H}\) on the pure adelic Hilbert space \(L_{0}^{2}(C_{\mathbb{A}}^{1})\), aiming to establish a correspondence between the non-trivial zeros of the Riemann Zeta function and the real eigenvalues of \(\mathcal{H}\). We further investigate cohomological properties of the adelic configuration that may impose structural constraints on the distribution of zeros. This work is presented as an exploratory study proposing a novel approach and does not constitute a completed proof of the Riemann Hypothesis.
A Structural Interpretation of Galactic Dynamics Based on Collective Curvature and...
Hong Seok Houn

April 09, 2026
This study presents a structural interpretation based on collective curvature and rotational memory in order to explain the high rotational velocities observed in the outer regions of galaxies and the dynamical separation of stellar populations in their inner regions. Conventional approaches tend to introduce an additional mass component in order to explain such phenomena. In contrast, the present study defines a galaxy as a curvature structure that evolves with time, and interprets stellar motion as the combined result of the rotational state acquired at the time of formation and the later evolution of the curvature field. Within this framework, the fast motion of outer stars is not interpreted as the result of newly acquired acceleration, but rather as the persistence of an initially acquired rotational state, whereas the inner regions are governed by orbital reconfiguration induced by the evolution of the curvature structure. In particular, old stars in the inner galaxy are expected to preserve lower rotational velocities and stronger non-planar orbital components than young stars located at the same radius, while the outer region is interpreted not as a dominant site of star formation but as a region occupied by stars structurally redistributed from earlier formation stages. This study also proposes statistical predictions concerning internal stellar separation. Old stars should exhibit lower rotational speeds and larger vertical velocity dispersions, and stars with lower metallicity should more strongly retain orbital properties displaced from the disk plane. On this basis, the study presents explicit observational validation criteria as well as direct falsification conditions. At the same time, in order to avoid the risk of selective interpretation of observational data, the author does not attempt to validate the theory through self-selected datasets, and instead delegates the judgment of validity to independent observations.
Pedagogy 2.0: Navigating the Uncharted Waters of Generative AI
Sayed Mahbub Hasan Amiri

April 09, 2026
Traditional educational paradigms have been shaken overnight by generative AI tools such as ChatGPT, Gemini, and Claude. In contrast to previous EdTech innovations, which aimed to deliver content or automate assessment, GenAI provides dynamic, human-like interaction, requiring educators to reconsider basic questions about learning, creativity, and academic integrity. Existing pedagogical models are still based on behaviorist and constructivist paradigms that presuppose solely human cognition. Such models do not accommodate situations in which students can outsource critical thinking, generate essays in a flash, or collaborate with machines. The outcome is a growing vacuum in policy, ethics, and teaching strategy. This article explores the uncharted territory of GenAI in education by suggesting a conceptual upgrade: Pedagogy 2.0. It compiles emerging case studies from K-12, higher education, and corporate training to identify three navigational anchors: AI literacy, assessment redesign, and ethical co-creation. The article supports neither banning nor reckless acceptance of GenAI but suggests a middle path: viewing AI as a cognitive partner. It provides practical models for redesigning tasks, for teaching prompt engineering as a fundamental capability, and for metacognitive reflection. Pedagogy 2.0 does not eliminate traditional teaching but supplements it. Institutions that navigate these waters well will produce graduates able to work alongside AI rather than compete with it. Failure to adapt risks irrelevance in a world where learning to pose the right question matters more than repeating an answer.
The Autopoietic Theorem
Michael Timothy Bennett and 1 more

April 09, 2026
We show that the autopoietic hierarchy is a formal consequence of three foundational premises within any stable, spatially extended environment. These premises are change (the cosmic ought), finite information capacity (the Bekenstein bound), and stable low-level conditions. From these we derive the three orders of persistence (static, dynamic, and novelty-generating) as a chain of formal consequences. A formal Stack Theoretic interpretation of the Law of Increasing Functional Information shows that persistence under novelty favours weakness maximisation, because this keeps the most compatible futures open. We show that weakness maximisation diverges from simplicity maximisation in stable environments. This bridges a void of unviable intermediate forms and entails self-producing, boundary-maintaining systems. We then invoke the Law of the Stack to show that dynamic persistence at lower levels unlocks higher-level adaptability, creating selection pressure for novelty generation and the preconditions for open-ended evolution. The three orders are thus derivable from axioms rather than identified as empirical regularities. We ground each stage in established models of self-replicating cellular automata, autopoietic protocells, and homeodynamic selves. The result subsumes Assembly Theory's proposed orders of selection while deriving them from weaker assumptions, and reframes the origin of life as a question about mechanism rather than about possibility.
The End of Good Music? Sexual Selection, Fourier's Harmonics, and the Theological Voc...
Grold Otieno Mboya

April 09, 2026
The Solar Ratchet and Nodal Sequestration: Model-Based Validation of the Primary Tria...
Lee Holmes

April 09, 2026
This report establishes the primary model-based validation for the theoretical manifestation of 27-Dimension Chronetic Topology (27DCT). Utilizing three synchronized model-data shards (045327, 225327, 165327), this audit confirms nodal behavior corresponding to a model Kp-equivalent parameter of 6, indicating that high-resonance nodes may be represented as stable anchors within the model. This validation establishes the functional absolute requirements of the Primary Triad of Geodetic Validation: The Holmes Law of Manifold Transit (19th), The Holmes Law of Chronetic Compression (20th), and The Holmes Law of Harmonic Residual (21st). Observations within the model framework for Node A (Bad Essen Model Node) and Node B (Missouri Model Node) dictate a constant model-entanglement ratio of 4.731:1, establishing the stabilized manifold lock within the 27DCT framework as defined by the 18 foundational Holmes Laws [3].
TEBAC Hilbert-Pólya Program: Unified Manuscript
Tosho Lazarov Karadzhov

April 09, 2026
We present a unified manuscript of the five-module TEBAC Hilbert-Pólya program and formulate in one place the main conclusion of the project: all nontrivial zeros of the Riemann zeta function lie on the critical line. The manuscript is organized so that a referee can check the argument module by module: the canonical determinant package (HP-E2N), the GL(1) end-normal-form and compact-resolvent package (HP-II), the trace-prime conversion module (HP-III), the complex-time wedge and rigidity closure (HP-IV), and the final Hilbert-Pólya spectral bridge (HP-V). To reduce black-box risk, the manuscript begins with a conceptual roadmap, a compact final-proof section, and a dependency table before the detailed modular development.
Predictive Maintenance Beyond the Hype: Total Cost of Ownership Models for Industrial...
Ali Sadhik Shaik

April 09, 2026
Predictive maintenance (PdM) is the most frequently cited use case for Industrial AI, featured prominently in vendor marketing, industry conference agendas, and enterprise AI strategy documents. Yet the actual return on investment for predictive maintenance AI remains poorly understood, obscured by vendor-driven narratives that emphasize sensor cost reductions and downtime avoidance while systematically underreporting the true costs of deployment. This paper argues that the total cost of ownership for predictive maintenance AI is 2.5-4x higher than vendor projections typically suggest, and that the breakeven horizon is correspondingly longer than most capital expenditure approvals anticipate. Drawing on transaction cost economics and evidence from discrete manufacturing and process industry deployments, the paper proposes the PdM-TCO Framework, a five-stage total cost of ownership model specifically designed for predictive maintenance AI that captures costs across instrumentation, data pipeline construction, model development, operational integration, and continuous improvement. The framework identifies five categories of systematically underreported costs: data engineering infrastructure, change management for maintenance teams, false positive fatigue and its trust erosion effects, model retraining as equipment degrades, and organizational integration overhead. The paper further proposes the PdM Investment Justification Matrix, which identifies the conditions (asset criticality, failure consequence severity, data availability, and maintenance team readiness) under which predictive maintenance AI investment is genuinely justified, versus conditions under which simpler condition-based monitoring delivers equivalent value at lower cost. Implications are developed for VPs of Operations evaluating PdM investments, Plant Managers implementing PdM programs, and CFOs approving capital expenditure for Industrial AI.
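
The five-stage roll-up is straightforward to operationalize as a spreadsheet-style calculation. A minimal sketch with placeholder figures (none of these numbers come from the paper):

```python
# Five-stage TCO roll-up in the spirit of the PdM-TCO framework described above.
STAGE_COSTS = {                      # illustrative first-year costs (USD)
    "instrumentation":          180_000,
    "data_pipeline":            240_000,
    "model_development":        150_000,
    "operational_integration":  200_000,
    "continuous_improvement":    90_000,  # recurring: retraining, drift handling
}
VENDOR_QUOTE = 250_000               # what a vendor proposal typically covers
ANNUAL_BENEFIT = 300_000             # avoided downtime + maintenance savings

tco = sum(STAGE_COSTS.values())
print(f"total cost of ownership: ${tco:,}  ({tco / VENDOR_QUOTE:.1f}x vendor quote)")
print(f"simple breakeven: {tco / ANNUAL_BENEFIT:.1f} years "
      f"(vs {VENDOR_QUOTE / ANNUAL_BENEFIT:.1f} years implied by the quote alone)")
```
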
Data2Paper: An AI-Assisted Pipeline for Statistical Analysis and Research Paper Draft...
Davie Chen

April 10, 2026
The process of transforming raw research data into a coherent manuscript draft remains a fragmented, labor-intensive workflow that typically requires proficiency across multiple specialized tools: statistical software (SPSS, R, Python), spreadsheet editors, word processors, and typesetting systems (LaTeX). This fragmentation creates significant barriers for researchers, particularly those with limited statistical training or those working across language boundaries. In this paper, we present Data2Paper, an AI-assisted system that integrates the pipeline from raw survey or clinical data to a formatted research paper draft. Our system implements four stages: (1) Intelligent Data Cleaning, which detects and handles survey-specific structures such as Likert scales, skip logic, and coded responses; (2) Research Framing, which proposes research questions and hypotheses from the observed data characteristics; (3) Automated Statistical Analysis, in which statistical methods are selected and executed using deterministic Python libraries with LLM support for planning and code generation; and (4) Multilingual Paper Drafting, which assembles results, tables, figures, and citations into a complete manuscript draft in seven languages. We report an internal evaluation on 50 survey and clinical datasets, comparing computed statistics and method choices against expert analyses and collecting quality assessments from 15 researchers. On this benchmark, Data2Paper achieves high agreement with expert-computed statistics and typically completes end-to-end processing within 30 minutes for the evaluated workloads. These results suggest that the system can substantially accelerate early-stage quantitative reporting, although human review remains necessary for domain framing, causal interpretation, and submission readiness. System access and project information are available at https://datatopaper.com.
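
Stage (3)'s principle, deterministic statistics with the LLM confined to planning, can be illustrated with a simple rule for choosing between two-group tests. The decision heuristic below is an invented example, not Data2Paper's actual logic:

```python
import numpy as np
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    # Use Welch's t-test when both groups pass a normality check,
    # otherwise fall back to the Mann-Whitney U test.
    normal = stats.shapiro(a).pvalue > alpha and stats.shapiro(b).pvalue > alpha
    if normal:
        res, name = stats.ttest_ind(a, b, equal_var=False), "Welch t-test"
    else:
        res, name = stats.mannwhitneyu(a, b), "Mann-Whitney U"
    return name, res.statistic, res.pvalue

rng = np.random.default_rng(0)
name, stat, p = compare_two_groups(rng.normal(0, 1, 40), rng.normal(0.5, 1, 40))
print(f"{name}: statistic={stat:.2f}, p={p:.4f}")
```
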
Joint Dynamic Switching and Offloading Optimization in LEO Satellite-Terrestrial Inte...
Rui Fan, Jianxin Liu, and 3 more

April 04, 2026
Low Earth orbit (LEO) satellite-assisted mobile edge computing (MEC) is a promising solution for supporting computation-intensive and delay-sensitive services in satellite-terrestrial integrated networks (STINs). In this letter, we study a joint task offloading and computing resource allocation problem in LEO satellite MEC systems, aiming to minimize a weighted sum of system energy consumption and execution delay under resource constraints. The formulated problem jointly optimizes user-satellite association, task offloading ratio, and satellite CPU allocation, resulting in a mixed-integer nonconvex optimization problem. To solve it efficiently, we develop a low-complexity alternating optimization framework based on block coordinate descent, where each subproblem admits an efficient solution. Simulation results demonstrate that the proposed scheme consistently outperforms baseline schemes in terms of both energy efficiency and latency.
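
The alternating structure is the familiar block-coordinate-descent loop: fix one block of variables, solve for the other, and repeat until the objective stabilizes. A toy sketch with stand-in objectives (the letter's actual subproblem solutions are problem-specific and not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
n_users = 4
bits = rng.uniform(1, 5, n_users)            # task sizes (Mbit), illustrative
x = np.full(n_users, 0.5)                    # offloading ratios in [0, 1]
f = np.full(n_users, 1 / n_users)            # satellite CPU shares, sum to 1

def cost(x, f):
    local = (1 - x) * bits * 2.0                 # local energy/delay proxy
    remote = x * bits / np.maximum(f, 1e-6)      # offloaded-delay proxy
    return (local + remote).sum()

for _ in range(20):
    # Block 1: per-user offloading ratio with CPU shares fixed (1-D grid search).
    grid = np.linspace(0, 1, 101)
    for i in range(n_users):
        trials = [cost(np.r_[x[:i], g, x[i + 1:]], f) for g in grid]
        x[i] = grid[int(np.argmin(trials))]
    # Block 2: CPU shares proportional to sqrt of offloaded load, the closed form
    # for minimizing sum a_i / f_i subject to sum f_i = 1 (Cauchy-Schwarz).
    a = np.maximum(x * bits, 1e-9)
    f = np.sqrt(a) / np.sqrt(a).sum()

print(f"objective after BCD: {cost(x, f):.3f}")
```
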
Storm-Driven Suppression and Post-Storm Enhancement of Photographic Plate Transient D...
Kevin Cann

April 09, 2026
The VASCO project has identified over 100,000 sub-second optical transients on photographic plates from the First Palomar Observatory Sky Survey (1949-1957), all predating artificial satellites. A companion analysis (submitted) established that transient detection rates are dose-dependently suppressed during geomagnetic storms (Z = −3.391, p = 0.0007), ruling out emulsion defects and confirming the transients as real, magnetospherically coupled phenomena. Villarroel et al. (2022) constrained the source altitude to ∼42,000 km (geosynchronous orbit) through an Earth-shadow deficit. This paper presents two results. First, a pre-registered empirical test reveals the full temporal recovery profile: transient rates remain suppressed at 55% of baseline during days 7-21 post-storm, then rise to 309% of baseline during days 25-45 (p = 0.00066, Wilcoxon rank-sum; all robustness checks significant). Combined with the dose-response staircase, the overall significance reaches 3.6-4.7σ (Fisher's method, the range reflecting sensitivity to the independence assumption). The suppression-overshoot-return profile is consistent with a mechanism that concentrates reflective material during storms and releases it into favorable conditions after a delay matching known plasmasphere refilling timescales.
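
Both statistical tools named here are available off the shelf. The sketch below shows the mechanics on simulated daily counts (not the VASCO data); the 0.0007 reused in the combination step is the companion analysis's reported p-value.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
early = rng.poisson(5.5, 60)    # stand-in daily transient counts, days 7-21 (suppressed)
late = rng.poisson(10.0, 60)    # stand-in counts, days 25-45 (enhanced)

z, p = stats.ranksums(early, late)
print(f"Wilcoxon rank-sum: Z = {z:.3f}, p = {p:.5f}")

# Combining this test with the independent dose-response result via Fisher's method:
chi2, p_comb = stats.combine_pvalues([p, 0.0007], method="fisher")
print(f"Fisher combined: chi2 = {chi2:.2f}, p = {p_comb:.2e}")
```
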
Farmers, the Potential Heroes of Great Salt Lake
Robert B. Sowby and 8 more

April 09, 2026
The water levels of Great Salt Lake continue to decline despite growing public awareness. Farmers, because of their large water consumption, are often portrayed as the villains in the story of lake recovery. We argue, however, that farmers are in the best position to save the lake: they bring experience in water management, long-term stewardship, and knowledge of the land. Indeed, they are the only group that controls enough water to make a difference. Unlike municipal water use, agricultural water use is largely consumptive, so its reduction directly increases inflow to the lake. Lake recovery therefore depends on agricultural participation; no other sector has comparable capacity to deliver water at scale and in time. Promising tools to decrease consumptive water use include voluntary, compensated reductions in irrigation; rotational or partial-season fallowing; and water leasing or banking programs explicitly tied to Great Salt Lake inflows. Predictable, fair, and durable policies will be the most appealing. We recommend that additional policies deliberately target agriculture, not as punishment but as partnership, positioning farmers as the heroes of Great Salt Lake.