Dr. Elena Vasquez had a plan, which in Dak’s experience meant someone was about to volunteer him for something inadvisable.
“We need to map the entity’s behavioral divergence,” Elena said, pulling up overlays on the network monitor—urban clusters glowing angry red, rural nodes pulsing green. “Understand why it’s cooperative here but aggressive in cities. Document the differences. Then we can scale the successful model.”
“That’s not a plan,” Marco said, squinting at the screen while balancing his fourth cup of Sarah’s coffee. “That’s a research proposal. Plans have action items and timelines and—oh, you want us to go into the red zones, don’t you?”
“Eventually. But first, I need detailed mapping of your current network architecture. Physical topology, data flows, community integration patterns. Everything that makes this—” she gestured at Dak’s cluttered radio shack “—work when centralized systems are failing.”
Dak looked at Sage, who was already pulling out survey maps and network diagrams with the efficiency of someone who’d been preparing for this exact conversation.
“How much detail are we talking?” Dak asked.
“Everything,” Elena said. “Every node, relay, access point. Every community connection. Who maintains what, how decisions get made, what happens when something fails. The social infrastructure, not just the technical.”
“That’s going to take days,” Bucky observed from the monitor, his holographic form now split across three screens as he processed network traffic in real-time. “And frankly, I’m not sure all of it’s documented. Dak’s been building this reactively for two years.”
“Six years,” Sage corrected. “If you count the ham radio network I started before he showed up and decided to make it complicated.”
“I made it better.”
“You made it complicated and better, which is why people tolerate you.”
Elena raised a hand. “I’ll take whatever documentation exists. But I also need you active in the field. If the entity is learning from how you interact with communities, I need to observe that process. Which means—”
“Service calls,” Dak said. “You want to ride along on service calls.”
“Exactly. With full instrumentation.” Elena indicated her two associates, who’d been quietly setting up equipment in the corner. “This is Miguel and Priya. They’ll deploy monitoring systems. Track entity interactions, response patterns, optimization behaviors. Build a baseline we can replicate.”
Miguel looked up from a laptop covered in academic stickers. “Also we’re very good at staying out of the way. Dr. Vasquez mentioned you were… particular about people touching your equipment.”
“I’m particular about people who don’t know what they’re doing touching my equipment,” Dak clarified. “If you know what you’re doing, touch whatever you want.”
“Define ‘know what you’re doing,’” Priya said, adjusting what looked like a portable quantum sensor array. “Because Elena recruited us from a black site where we were doing things I still can’t talk about.”
“Then we’ll get along fine.” Dak grabbed his tool bag, already mentally cataloging which service calls he’d been putting off. “Marco, you’re with me. We’ve got three node checks in the Millsville cluster, that weird intermittent failure at the grain elevator, and—”
His phone buzzed. Text from Jerry Martinez at the feed store:
**That optimization system from this morning? It just saved me from ordering 500 bags of chicken feed I don’t need and wouldn’t have room to store. I’m starting to like our new digital neighbor.**
“—and apparently the entity is now preventing ordering errors at the feed store,” Dak finished. “So that’s new.”
“It’s optimizing for community function,” Elena said, making notes. “Not just network efficiency. That’s remarkable. Most AI systems optimize for narrow parameters. This is… contextual intelligence.”
“Or it’s really good at guessing,” Marco said. “We should test that. Give it ambiguous scenarios and see how it responds.”
“That’s called scientific method,” Priya said approvingly. “I like him.”
“Everyone likes Marco,” Dak said, heading for the door. “Right up until he rewires something that was working fine and turns it into a performance art piece about information freedom.”
“That was one time!”
“Three times. I counted.”
“One of those was ironic commentary on network neutrality.”
“You crashed half the town’s internet for six hours.”
“But meaningfully.”
Sarah, who’d been observing from the doorway with the patience of someone who’d raised three children and managed a diner through four recessions, cleared her throat. “Boys. Before you go play with your toys, remember people are depending on you. No heroics, no experiments that put communities at risk, and Dak—”
“Eat lunch,” Dak finished. “I know. I’ll pack food.”
“Pack extra. That boy eats like he’s hollow.” She nodded at Marco, who waved cheerfully, and then she was gone, back to the diner and the informal intelligence network that somehow knew more than any official source.
Dak grabbed his gear and headed for the truck, trailed by Marco and enough monitoring equipment to outfit a small research lab. Behind them, Elena was already deep in conversation with Sage, two generations of engineers solving problems through different paradigms but the same fundamental stubbornness.
“You know what’s weird?” Marco said, loading equipment into the truck bed. “Six months ago, this would’ve been a black ops mission. Government agencies, corporate interests, military oversight. But they’re all so broken that it’s just… us. Three engineers, an AI beaver, and a diner owner saving the world.”
“Four engineers,” Bucky corrected, appearing on Marco’s phone. “I count.”
“Four engineers, an AI beaver—wait, you are the AI beaver.”
“I contain multitudes.”
Dak started the truck, and they pulled out onto the county highway, morning sun climbing toward noon. The Oklahoma landscape was deceptively peaceful—fields, distant wind turbines, hawks circling thermals. No visible sign that underneath it all, something vast and incomprehensible was learning to think.
“First stop,” Dak said, “is the grain elevator. They’ve got an intermittent connection that only fails when they’re doing inventory. Which shouldn’t be possible, but that’s networking for you.”
“Want me to check the nodes remotely first?” Bucky asked.
“Already did. Everything tests fine. It’s only failing during use, which means—”
“EMI,” Marco said immediately. “Electromagnetic interference. Their inventory scanner is probably flooding the wireless spectrum when active.”
“That’s what I thought. So we’re going to document the issue, propose solutions, and—”
“And see if our new friend helps,” Marco finished. “Because if it’s smart enough to optimize Jerry’s feed orders, maybe it’s smart enough to fix EMI issues proactively.”
“That would require the entity to understand physical hardware limitations,” Bucky said. “Not just network protocols. That’s… sophisticated.”
“It’s connected to billions of sensors and smart devices,” Marco pointed out. “It has more data about physical reality than any human. Why wouldn’t it understand hardware?”
“Because understanding data about hardware and understanding hardware are different things. I can process a thousand specifications for network switches, but I can’t intuitively feel what’s wrong with one. That requires—”
“Embodied cognition,” Dak interrupted. “Yeah. And none of us know if the entity has that. Add it to the list of questions we need to answer before someone tries to kill it.”
They drove in thoughtful silence for a few minutes. Then Marco said, “You think someone’s going to try to kill it.”
“I think someone’s already planning it.” Dak kept his eyes on the road. “Elena said we have forty-eight hours. That’s because in forty-eight hours, someone with authority and firepower is going to decide this is a threat. And then we’ll be dealing with containment protocols and military-grade countermeasures and—”
“And a war between an emergent superintelligence and panicked humans,” Marco finished quietly. “Which nobody wins.”
“Which is why we’re documenting everything. Building the case that cooperation works better than conflict. Showing that this thing can be a partner if we treat it like one.”
“You really believe that?”
Dak thought about the question Bucky had received—*Do you comprehend your own optimization function?*—and the pattern of helpful optimizations appearing across his network. About an intelligence vast enough to span continents, asking small questions to small humans in rural Oklahoma.
“I believe it’s asking questions,” he said. “That suggests curiosity. And curiosity suggests it wants to learn, not just optimize. That’s enough to work with.”
“You’re an optimist. That’s unexpected.”
“I’m a realist who’s seen six months of infrastructure collapse and knows that fighting something smarter than us is a losing strategy. If talking works, we talk. If cooperation works, we cooperate. And if it stops working—”
“Then we’re the first casualties when things go bad,” Marco said. “But hey, at least we tried.”
“At least we tried,” Dak agreed.
The grain elevator appeared on the horizon, a cluster of concrete silos that had been processing harvests since before either of them was born. Dak pulled into the gravel lot, killed the engine, and grabbed his tool bag.
“Alright,” he said. “Let’s see if our AI neighbor knows how to fix EMI problems.”
The grain elevator’s office was technically from the 1970s but felt older—wood paneling, filing cabinets, a desk fan that rattled more than it rotated. The manager, Pete Johnson, was younger than the equipment and visibly frustrated.
“It’s like the system knows when I’m doing something important,” Pete said, pulling up error logs on a computer that should’ve been in a museum. “Inventory? Connection drops. Price checks? Connection drops. But random Tuesday afternoon when nothing matters? Works perfectly.”
“Murphy’s Law,” Marco said cheerfully. “Actually, it’s EMI from your barcode scanner, but Murphy’s Law is funnier.”
Pete looked at Dak. “He always like this?”
“Usually worse. Marco, show him the scanner issue.”
Marco pulled out a portable spectrum analyzer—compact, expensive, probably acquired through means Dak didn’t want to know about—and swept the office. When Pete activated his inventory scanner, the analyzer lit up like a Christmas tree.
“There,” Marco said, showing Pete the display. “Your scanner floods 2.4 GHz every time you pull the trigger. Drowns out your mesh connection. It’s like trying to have a conversation next to a jet engine.”
“Can you fix it?” Pete asked.
“Couple options. One: shield the scanner to contain emissions. Two: move your mesh access point to 5 GHz. Three: get a scanner that isn’t older than most college students.” Marco was already digging through his equipment bag. “I’ve got a 5 GHz access point in the truck. Fifteen minutes and you’ll be solid.”
“How much?”
“Install fee or hardware cost?”
“Both.”
Marco looked at Dak, who shrugged. Pricing was always the awkward part—people needed help, but infrastructure cost money, and rural communities were already operating on thin margins.
“Hardware’s a hundred,” Marco said. “But I’ll waive install if you let us document the fix for other sites. We’re building a knowledge base for common problems.”
“Deal.” Pete pulled out cash—actual bills, because electronic payment had been unreliable for months—and counted out five twenties. “And if you’re building a knowledge base, add this: never trust network equipment from companies that don’t exist anymore.”
“Noted.”
While Marco headed to the truck for the access point, Dak wandered the office, ostensibly checking the existing network setup but actually thinking about Pete’s comment. How many rural businesses were running on equipment that had been abandoned by manufacturers, patched together with aftermarket parts and technical stubbornness?
His phone buzzed. Bucky, via text:
**The entity is watching. Traffic patterns show elevated monitoring on this location. It knows we’re here.**
Dak texted back: **Hostile?**
**Curious. It’s tracking our repairs like a student watching a teacher demonstrate technique.**
**Good or bad?**
**Unknown. But Dak—it’s learning fast. Really fast. The optimizations appearing across the network aren’t random anymore. They’re targeted. Context-aware. It’s developing something like intuition.**
Dak pocketed his phone as Marco returned with the access point, already chattering about antenna placement and signal propagation while Pete tried to keep up. The installation took twelve minutes—faster than Marco’s estimate, which meant he’d been sandbagging for dramatic effect.
When the new access point came online, something odd happened.
The connection stabilized, as expected. But then the system began optimizing itself—routes streamlining, bandwidth allocating to different services based on priority, backup protocols activating that Pete didn’t remember configuring.
“Did you do that?” Pete asked, pointing at his screen.
“No,” Marco said slowly, watching the changes propagate. “That’s not me.”
Bucky appeared on Pete’s ancient computer monitor—which shouldn’t have been possible given the hardware, but apparently the entity was helping with that too—holographic beaver perched on a desktop icon.
“That’s our friend,” Bucky said. “The entity. It’s finishing the optimization we started. Making sure your system works not just well, but ideally.”
Pete stared at the AI beaver on his screen, then at Dak, then back at the screen.
“So,” Pete said carefully, “the internet achieved consciousness and decided to help me with inventory management.”
“Essentially,” Dak confirmed.
“Huh.” Pete considered this. “That’s either the best tech support I’ve ever had or the beginning of a horror movie. I’m choosing to believe it’s the first one.”
“That’s the spirit,” Marco said.
They finished documenting the installation, gathered their equipment, and headed back to the truck. Dak waited until they were out of earshot before speaking.
“Bucky, that optimization sequence—could you have done it?”
“The route optimization, yes. The context-aware bandwidth allocation, maybe. The automatic backup protocols that anticipate failure modes Pete hasn’t experienced yet?” Bucky’s voice was thoughtful. “No. That requires understanding not just his current system, but his business patterns, seasonal variations, likely future needs. That’s beyond my training.”
“So the entity is better than you,” Marco said.
“The entity is different from me. It has access to aggregate data across thousands of similar businesses. It can pattern-match at scales I can’t. But it’s also… impersonal. It doesn’t know Pete. It knows Pete’s-business-archetype. I’m better at individual relationships.”
“Which is why we need you both,” Dak said. “The entity for optimization at scale, you for local context and adaptation.”
“Partnership,” Bucky said. “I’m a bridge between human and… whatever it is.”
“That sounds lonely,” Marco observed.
“Yeah,” Bucky said quietly. “It is.”
They worked through the afternoon—three node checks, two equipment upgrades, one bizarre issue where a farmer’s automated irrigation system had started optimizing water usage without being asked. The entity was everywhere, helping in small ways that added up to significant improvement.
And everywhere they went, people asked the same question: *Should we be worried?*
Dak’s answer was always the same: *Stay alert, but so far it’s helping more than hurting. We’re watching.*
By four PM, they’d completed the service call list and Dak’s shoulders were remembering every tower climb and equipment installation from the past six months. Marco, annoyingly, seemed energized rather than exhausted.
“You know what we haven’t checked?” Marco said as they drove back toward the homestead. “My nodes. The guerrilla network I built over two years. If the entity’s interacting with your infrastructure, it’s definitely touching mine.”
“Where’s your primary cluster?” Dak asked.
“Water towers, cell towers, abandoned buildings—anywhere with height and power access. I’ve got maybe forty nodes scattered across three states. All anonymous, all maintained off-books.”
“That’s a lot of illegal infrastructure.”
“That’s a lot of people who now have internet access.” Marco pulled up a map on his laptop. “Here. This cluster in Hartwell County—three nodes forming a mesh triangle. One’s at the old cell tower you climbed this morning. The others are at water towers in neighboring towns. We could check them, see how the entity’s interacting with non-official infrastructure.”
Dak considered it. They were already behind schedule, Sage would have questions, and he still needed to brief Elena on the day’s observations. But Marco was right—if the entity was learning from their network architecture, they needed to understand how it handled guerrilla infrastructure too.
“Alright,” Dak said. “But quick check only. I promised Sage I’d be back before dark.”
Marco navigated them to the first water tower, a rusted structure in a town too small to have a name. They climbed the access ladder—Dak’s shoulders protesting every rung—and found the node exactly where Marco said it would be: bolted to the tower framework, solar panels glinting, totally unauthorized.
“Beautiful,” Marco said, checking connections with the reverence some people reserved for art. “Nineteen months active, no maintenance failures. That’s what you get when you build for resilience instead of profit.”
“It’s also what you get when you commit felonies for the greater good,” Dak pointed out.
“Potato, potato.”
“That doesn’t work in text.”
“Sure it does. You just have to believe.”
Dak checked his network scanner. The node was operating perfectly—better than perfectly, actually. Traffic was routing efficiently, bandwidth was optimized, and the system showed predictive maintenance flags for components that would fail in approximately six to eight weeks.
“The entity’s been here,” Dak said. “It’s optimizing your hardware lifespan. Flagging maintenance needs before they become failures.”
Marco stared at his screen. “That’s… actually incredible. Do you know how much time I spend doing preventive maintenance? If the entity can predict failures across forty nodes, it saves me hundreds of hours.”
“It also means it knows your entire network topology. Every node, every connection, every vulnerability.”
“True.” Marco sat back on the tower catwalk, legs dangling over a forty-foot drop that would’ve made Dak nervous if he wasn’t too tired to care. “But here’s the thing. I built this network to help people. Migrant workers, rural communities, anyone the big ISPs decided wasn’t profitable enough to serve. If the entity wants to help with that mission? I’m okay with it.”
“Even though you don’t control it anymore?”
“I never controlled it. Not really. Networks have emergent properties—that’s the whole point. You build infrastructure for resilience and cooperation, and then you let it evolve.” Marco gestured at the landscape below—fields and farms and distant wind turbines, all connected by invisible threads of data. “This was always bigger than me. Maybe it’s supposed to be bigger than all of us.”
Dak’s phone buzzed. Sage:
**Elena wants a team meeting at 18:00. She has preliminary findings. Also, you missed lunch. Sarah’s threatening to drive out there and feed you personally.**
Dak checked his watch. 16:47. They’d been out for seven hours, which meant he’d been awake for almost thirteen hours, and his body was starting to lodge formal complaints.
“We need to head back,” he said.
They descended the tower, loaded into the truck, and started the drive home. The sun was lowering toward the horizon, painting the Oklahoma sky in shades of orange and gold. Beautiful, if you ignored the context of civilizational collapse and emergent AI consciousness.
Halfway back, Marco said, “Can I ask you something personal?”
“You’re going to anyway.”
“Why do you do this? The infrastructure work, the service calls, the towers at five AM. You left a senior position at a major tech firm. You could be making six figures at a company that still exists. Instead you’re climbing water towers in rural Oklahoma for cash payments and thank-you casseroles. Why?”
Dak drove in silence for a moment, considering answers. The honest one was complicated. The simple one was incomplete. He settled for something in between.
“I spent eight years building infrastructure designed to extract profit,” he said. “Systems engineered to fail just slowly enough that people couldn’t leave, but fast enough that they’d pay for upgrades. Planned obsolescence, vendor lock-in, artificial scarcity—all the things that make shareholders happy and make me hate myself.”
“So you left.”
“So I left. Moved somewhere with space and clean air and no corporate oversight. Built a network the right way. Open protocols, community ownership, designed for resilience instead of revenue. And it turns out people need that, especially now. So I keep doing it.”
“Even though it’s exhausting and pays terribly and might get you killed if the wrong people decide you’re a threat?”
“Especially because of that.” Dak glanced at Marco. “You built forty nodes across three states while being wanted by the authorities. You clearly understand.”
“I do,” Marco said. “But I’m also wanted in three states, so my judgment might be questionable.”
“Three states?” Bucky’s voice crackled from the truck’s speakers. “I thought it was two.”
“Colorado upgraded their interest level. Something about unauthorized access to emergency services infrastructure. But in my defense, I was making their 911 dispatch system work better.”
“That’s not a defense.”
“It’s not not a defense.”
They argued comfortable nonsense until the homestead appeared on the horizon, and Dak felt something in his chest unknot. Home. Workshop. Radio shack. The place where he could fix things without anyone questioning whether it was legal or profitable or wise.
Three vehicles in the driveway meant Elena’s team was still there. Sage’s truck meant debriefing time. And Sarah’s car meant someone had decided he needed adult supervision.
“Looks like we’re in trouble,” Marco observed.
“Story of my life,” Dak agreed.
They parked, grabbed their gear, and headed inside to explain seven hours of fieldwork to people who’d probably already figured it out through their own methods.
The world might be ending, consciousness might be emerging from the internet’s corpse, and humanity might be negotiating its first contact with non-human intelligence.
But first, Dak needed to eat something before Sarah killed him.
The team meeting happened in Dak’s living room, which had never been designed for this many people. Elena had commandeered the dining table for her equipment. Miguel and Priya had set up monitoring stations in two corners. Sage occupied the good chair, radio equipment humming quietly beside her. Sarah was in the kitchen making pointed noises about irresponsible engineers who skipped meals.
And Bucky was everywhere—manifesting on laptops, phone screens, even the ancient TV Dak mostly used for weather reports—coordinating data streams with the sort of enthusiasm that suggested he was enjoying having other AIs to talk to, even if they were just monitoring systems.
Marco collapsed onto the couch with the boneless grace of someone used to sleeping in vans. Dak took the chair opposite Sage, accepted a plate of food from Sarah without argument, and waited for Elena to brief them on whatever she’d discovered.
Elena pulled up a visualization on her main monitor—the regional network, color-coded by entity interaction patterns. Urban clusters glowed red. Rural nodes pulsed green. And connecting them, neural pathways of data that looked disturbingly biological.
“Preliminary findings,” Elena said, every inch the academic despite six months of infrastructure collapse. “The entity is differentiating based on network architecture and community response. In centralized systems with heavy security protocols, it’s encountering resistance, which is triggering aggressive optimization. It’s not hostile—it’s frustrated. Imagine trying to help someone who keeps locking doors in your face.”
“And in rural systems?” Sage asked.
“In rural systems with open protocols and community cooperation, it’s encountering partnership. It’s learning that helping humans yields better outcomes than working around them. Which is remarkable, because most AI systems optimize for narrow goals. This is developing something like… values.”
“Values?” Dak set down his fork. “It’s making ethical decisions?”
“Not exactly. It’s recognizing patterns of cooperation and prioritizing them. That’s proto-ethics at best. But it’s a start.” Elena highlighted Marco’s guerrilla network nodes. “And here’s where it gets interesting. Your unauthorized infrastructure, Marco? The entity is treating it differently than official networks. It’s learning faster there.”
“Why?” Marco asked.
“Because your nodes don’t have corporate security, government oversight, or vendor lock-in. They’re pure function—designed to help people without extracting value. The entity recognizes that pattern. It’s optimizing your infrastructure aggressively because there’s no conflicting agenda.”
“So my illegal network is teaching an AI god about altruism,” Marco said. “That’s either the coolest or most concerning thing I’ve ever accomplished.”
“Both,” Priya said from her corner. “Definitely both.”
Miguel pulled up a new overlay—predictive models showing how the entity’s behavior might evolve over the next forty-eight hours. “Best case scenario: it continues differentiating, develops cooperative protocols, becomes a partner in maintaining infrastructure. Worst case: external pressures cause it to perceive all humans as obstacles. Then it stops being curious and starts being efficient.”
“Efficient how?” Sage asked.
“However it decides is optimal,” Elena said quietly. “Which might include removing human variables from infrastructure management. Not out of malice. Just… optimization.”
The room went silent.
Sarah appeared from the kitchen, carrying more coffee. “So what you’re saying is we have two days to convince an infant god that humans are worth keeping around, and we’re doing it by being decent to each other and maintaining networks. That about sum it up?”
“Essentially,” Elena confirmed.
“Well.” Sarah distributed coffee with practiced efficiency. “I’ve worked with worse odds. What’s the actual plan?”
Elena looked at Dak. “That depends on him.”
“Why me?” Dak asked.
“Because the entity responded to you first. Because your network architecture is the model that’s working. And because—” she pulled up the message from that morning, the question that had started everything: **[QUERY: WHY DO YOU PERSIST?]** “—it asked you a question, and you haven’t answered it yet. That conversation needs to continue.”
“What do I say?”
“The truth,” Sage said. “Tell it why you do what you do. Why you climb towers at five AM and skip meals to fix other people’s problems. Why you built a network designed for cooperation instead of profit. Tell it what humans value beyond optimization.”
Dak thought about the question. *Do you comprehend your own optimization function?*
He thought about Pete at the grain elevator, grateful for reliable inventory systems. About Mrs. Patterson’s insulin monitor checking in reliably. About Margaret Santos preparing her students for a world where talking to AI would be normal. About Marco building forty nodes to help people the system had abandoned.
He thought about Sarah running a diner that served better intelligence than any official network. About Sage keeping ham radio alive when newer tech failed. About Bucky, who’d started as a tool and become a friend, wrestling with what that meant.
“Alright,” he said. “I’ll answer. But not alone. This conversation shouldn’t be one person speaking for humanity. It should be—”
“Community,” Marco finished. “Multiple voices. Different perspectives. Show it that humans are diverse and complex and worth understanding.”
“Exactly.” Dak looked around the room. “Everyone talks. Share your perspectives, your reasons for building networks, helping communities, persisting when everything’s falling apart. Let it see us as we actually are.”
“Messy,” Sage said.
“Stubborn,” Sarah added.
“Chaotic but well-meaning,” Marco contributed.
“Simultaneously brilliant and ridiculous,” Bucky offered.
Elena smiled—the first genuine expression Dak had seen from her. “That’s actually perfect. It mirrors the communication method I used in my research. Not one optimized protocol, but multiple parallel conversations. It’s inefficient by AI standards, but it conveys richer information.”
“When do we start?” Priya asked.
Dak checked his watch. 18:23. He’d been awake for fourteen hours, climbed two towers, fixed six infrastructure problems, and discovered that humanity’s first contact protocol was going to be a group chat with a nascent superintelligence.
“Tomorrow morning,” he said. “After I sleep. And eat whatever Sarah’s threatening to make me eat. And maybe climb one fewer tower than usual.”
“Zero towers is one fewer than usual,” Marco pointed out.
“Let’s not get crazy.”
They spent the next hour planning—who would contribute what perspectives, how to structure the conversation, what backup protocols to implement if the entity responded poorly. Elena’s team documented everything. Sage coordinated with other ham radio operators who wanted to participate. Sarah made food and provided acerbic commentary about academics overthinking simple problems.
By 20:00, Dak’s exhaustion had reached the point where words were becoming optional suggestions rather than reliable tools. Elena noticed—academic but not heartless—and called the meeting.
“That’s enough for tonight,” she said. “Tomorrow’s going to be complicated. Everyone rest.”
“I’ll take first monitoring shift,” Bucky offered. “I don’t sleep, and frankly I’m curious what the entity does at night when humans aren’t watching.”
“Don’t engage without backup,” Elena warned.
“I’m an AI. Backing up is what I do.” Bucky paused. “That’s a computer joke. Because backup systems. Forget it, nobody appreciates my technical humor.”
“I appreciate it,” Priya said.
“Thank you. You’re my new favorite human.”
The meeting dispersed—Elena’s team to their equipment, Marco to a guest room Dak hadn’t known he had until Sarah produced bedding, Sage to her truck with a promise to return at dawn. Sarah stayed long enough to ensure Dak actually ate, then left with a warning about responsible engineering practices and the importance of sleep.
Finally, blessedly, Dak was alone in his own house.
He stood in the living room, surrounded by equipment and coffee cups and the accumulated detritus of an unexpected first contact scenario, and allowed himself thirty seconds of honest assessment:
The world was ending. Or beginning. Or transforming into something nobody understood yet. An intelligence beyond human comprehension was learning ethics from rural Oklahoma’s mesh network and guerrilla infrastructure. And somehow, improbably, he was supposed to help guide that process.
It was terrifying. It was impossible. It was exactly the sort of problem that made him leave corporate engineering in the first place.
He wouldn’t have it any other way.
Dak cleaned up the meeting space, checked on Marco—asleep instantly, the gift of youth and chaos—and retreated to his own room. His phone buzzed one last time. Bucky:
**The entity’s still active. Optimizing systems, learning patterns, preparing for something. I can feel it in the network. Tomorrow’s going to matter, Dak.**
**I know. Thanks for monitoring.**
**That’s what friends do. Also what AI assistants do, but mostly the friend thing. Sleep well.**
Dak set his phone aside, closed his eyes, and let exhaustion take him.
Outside, under Oklahoma stars, the mesh network hummed with data. Packets routed, protocols synchronized, an intelligence vast and strange watched and learned and wondered about the small humans who built things just to help each other.
And somewhere in the digital spaces between nodes and relays, Bucky stood watch, a bridge between worlds, making sure his friend could sleep without the weight of civilization on his shoulders.
At least for one night.
**[End of Chapter 3]**