
MentalMath: It's the thought that counts

2014.03.24 06:30 gmsc MentalMath: It's the thought that counts

Want to learn how to do math in your head, or even just wrap your head around a mathematical concept? This is the place!
[link]


2024.04.28 23:25 akai___hana need help finding my type, or confirming it.

I need help figuring out my type. After months of (casual) reading into socionics, and some superficial introspection, I have concluded I fit generally into ESE. But I often doubt my own judgment, and I'm very unsure of myself (and also intellectually dependent on others, but more on that later lol), so I wanted a second opinion. I'll be listing off random observations of my behavior/cognition and some things about myself off the top of my head, and I hope the more knowledgeable people here can assess me based on those things. I know that it isn't enough to really understand someone, and you could never really know a stranger through some post online, but like I said, I just need a second opinion/perspective to get some insight, and this post is more of an experimental attempt to help me type myself correctly. Since the information may not be enough, I give you liberty to assume things about me/my personality or nature based on what I say/claim about myself, and I also would like complete/brutal honesty; maybe I'd find out something I didn't realize about myself before. Just give a heads-up when you type me based on those assumptions. I'm a bit young, and my identity is unclear and unstable, so sorry if I come across as inept in describing/explaining myself, and excuse me if I make mistakes/contradicting statements; I really don't know myself that well and just say whatever comes to mind. I rant and ramble a lot, and I often go on random tangents. So, if this post is too long for you and you want to comment, just read the first part of every paragraph or skim through it, and give your take if you'd like. And if you have any questions, ask away.
I think I've made this long enough. what do you think? I'm sorry if my descriptions are too negative/distasteful as I'm just trying to be honest.
submitted by akai___hana to Socionics [link] [comments]


2024.04.28 23:23 Fancy_Abalone3990 Does anyone get incredibly intense brain zaps from this med?

This drug is incredible. It cured my social anxiety and other forms of anxiety I had. The only problem is the brain zaps; they literally feel like I’m getting electrocuted. My entire body pulses every time my eyes move, which makes me so incredibly nauseous that I try to prevent my eyes from looking in any direction other than straight ahead. This lasts about 45 minutes and then goes away, but I still feel the nausea hours later. I’m on 15mg twice a day and I get this symptom on every dose. It won’t go away :(
submitted by Fancy_Abalone3990 to BusparOnline [link] [comments]


2024.04.28 23:20 Born-Beach The One Beneath - Part 2 [Final]

Part One.
My jaw clenches. It’s my turn to go pale with shock. Suddenly, the puzzle pieces begin to connect in my mind. They’re building a picture that I’m not sure I want to see, but it’s a picture that’s becoming difficult to deny. “Why?” I press. “What makes you so sure they weren’t just test subjects like the others?”
“These felt different,” Maria says quickly. “Horrible in a way that even the others couldn’t compare to. It’s like when you look at a mannequin, or a doll… What’s the phrase?”
“Uncanny valley,” I offer.
“That’s it,” she says. “That’s what I felt looking at these things, the uncanny valley. It was like they didn’t have a soul– like they never had a soul. Some looked human. Nearly. But they were too tall, or their limbs were too long, or they had too many teeth in all the wrong places. But what scared me most of all wasn’t the bodies, it was the thought that something had killed those things. Something had torn literal nightmares to pieces, and there was a good chance it was coming to do the same thing to me and John.”
“John,” I say, still trying to parse his significance in her ordeal. “That many bodies couldn’t have appeared overnight. They’d been there for a long time. That means he probably knew about them, didn’t he?”
She nods, gasping. “He knew. He fucking knew. He shoved me onto that pile of corpses, that festering and decaying pit of monsters, and told me as much. He started shouting. Calling me a monster all over again. Evil, he said. Twisted. He kept pointing at me like all of this was my fault, as if he hadn’t led us both to our deaths.”
Her voice becomes a stuttering mess. “A-all the while I heard that thing in the dark. Approaching. I felt terrified, hopeless and numb. I kept asking John why me? Why go through all this trouble just to kill me? And he told me that he didn’t have a choice. He knelt next to me, put a hand on my cheek and whispered that his child needed to feed. It was getting hungry. Desperate. He almost looked fucking r-remorseful if you can believe it, and he told me that he was really sorry, and that he hated to do this but… He stepped away from me. Stood against the wall of the chamber. Watched. Waited.”
For a second, I’m afraid Maria is going to break into fresh sobs, but she pushes through.
“I didn’t know what to do,” she continues, wiping tears from her cheeks. “I didn’t have anywhere to run, anywhere to hide, so I just lay there in that heap of monsters. I gave up. The whole time, those footsteps got closer and closer. The nearer they came, the slower they got. It was like it knew I was trapped. Like it’d done this before, and knew there wasn’t a rush…” She looks up at me. “Do you think… John did that to other people too?”
“It’s certainly possible. Did you get a good look at the creature?”
She shudders. “Yes. I had my headlamp trained on the passage the whole time, and when it appeared around the corner, I almost missed it. I heard it, but I could barely see it. It was a tall, flickering shadow. It pulsed. Vibrated. The way it moved was jerky, haphazard, almost like it had one foot in our reality, like it was glitching with every step it took.”
“Glitching…” I mutter. Why does that sound familiar?
“That’s right,” she says. “And that wasn’t even the strangest thing about it.” She gets small in her chair. “It had these eyes. Amber ones. Bright and gleaming, like twin cinders smoldering in empty space. It felt like they were piercing me, like its eyes were digging through my skin and looking into my mind. Or my soul. It was like that thing was taking bites out of my memories, tasting them before spitting them back out…”
“How did it feel? Painful?”
“No,” she says. “It felt cold. Like a blizzard in my head, like all my thoughts had frozen to a crawl. Maybe that’s why I calmed down. I don’t know… I remember sitting there, totally numb as the Shadow phased through the metal bars of the gate. It almost looked human. It had two arms, two legs and a head, but its body was made of black static. Like television interference.”
Television interference… Where have I heard that description before? I rack my mind for a match, some kind of urban legend or ancient lore that matches what she’s saying, but nothing jumps out. I flip through the pages of my clipboard, stopping on one labeled ABERRANT EVENTS. It’s The Facility’s own Most Wanted List. My eyes fly through the cases listed, but there isn’t anything close to what she’s describing.
An idea strikes me.
“Did the Shadow hurt you at all?”
She looks down at her arm. There’s a large gash there, framed by clots of dried blood. “No… I don’t think so,” she says hesitantly. “I got these injuries when I was trying to escape.”
No, of course it didn’t. It had other food available already. “And what happened after it pierced you with its eyes?” I ask.
“It walked past me,” she says. “It walked through that mulch of corpses and headed straight for John. It started speaking along the way. At least, I think it did.”
“What do you mean by speaking?”
“Do you remember how I said it was howling before?”
“I do.”
“Well, this time it was hissing– like a livewire, or static electricity. Whatever it was communicating, John looked panicked. He was crying. Pleading with it. He kept saying that he’d done his best, but there was nothing else out there, so the Shadow would have to make do with me. But the Shadow didn’t seem to care. It grabbed John by his long hair, lifted him up to the ceiling and its cinderlight eyes started gleaming an angry orange.”
My heartbeat races. My pen flies across the clipboard, desperately trying to avoid missing a single detail.
Maria keeps talking. She keeps giving me more of what I need. “John kicked and screamed,” she says. “He begged me to help him, told me that if I didn’t I was every bit the monster he’d said I was and I’d be next… But before he could finish, the Shadow’s eyes flashed and leaked fire. John started shrieking, moaning as his face melted into his skull.”
Maria’s face twists with revulsion. Disgust. She looks away, back to the bunker. I wonder if she’s hearing what I am– that dim rumble of something moving underground, that slow march of an approaching nightmare. Our clock is ticking. It’s not something I can tell her though, because as soon as she starts panicking, I lose the chance to connect the dots I need.
“Maria,” I say, pulling her attention back. “Continue. It’s critical I get these details.”
“Sorry… It’s not a memory I like thinking of but… The Shadow held John there, his legs twitching weakly, and then it grabbed his head and tore it off his neck.” She brings a hand to her mouth, starts nervously biting her nails. “Then it lifted John’s skull to its amber eyes. It opened its mouth and screamed fire. The heat I felt was like an open furnace, like Hell itself. Tendrils of darkness emerged from the Shadow, clutching at John’s scorched skull and cracking it open like an egg.
"His brain spilled out. The Shadow caught it in those tendrils, and brought it into itself. His brain. Like it was fucking assimilating it… Or eating it. ” She looks up at me, and there’s the same angry defiance I saw when we met. “Now do you get it?” she asks. “Now do you see what I mean about this thing being the devil? What else could do something like that?”
A good question. I can think of one entity. Only one. If my guess is correct, then Maria and I get to live to see tomorrow’s sunrise. If it’s wrong, then I need to put a bullet in both our heads before that thing finds us.
All of it hinges on my next question.
“It killed John, then what? What did the Shadow do?”
“It turned back to me,” she says. “It glared at me with those blazing eyes, and I thought I was next. I knew I was. But then I felt another blizzard sweep across my mind, and that was it– I blacked out.”
“Hang on…” I mutter. “What do you mean you blacked out? I found you lying outside of the bunker. How did you escape?”
She shakes her head, frantic. “I don’t have a clue. I blacked out, then the next thing I remember was waking up outside the bunker, with you pouring water on my face and telling me we needed to talk. That’s it.”
She shoots up from her chair. “Christ! We need to leave.”
I blink. “Why?”
“The police. I’ve gotta tell them about John and what he was doing. I’ve gotta tell them about this base. Maybe John brought others here. More victims. Maybe some of them are still alive down there and need help. We need search parties and–”
“Don’t bother,” I say.
She looks at me, stunned.
“The police won’t have any record of John. Tell them where you were, what you saw in that bunker, and they’ll probably kill you.” I reach into my pocket, pull out my lighter and run a thumb down the sparkwheel. It flickers to life. “Fact is, John doesn’t exist. Neither does this base.”
I bring the lighter to the edge of my clipboard. The flame catches a page.
“What the hell are you doing?” Maria exclaims.
“Saving your life,” I say, tossing the clipboard to the floor. It pops and cracks as the fire eats the woman’s story, one word at a time.
“What the fuck? You said you believed me!”
“I still do,” I tell her. “That’s the problem. An hour ago, I had no idea what was going on here, but the more you spoke, the more it started making sense. I realized that you and John were more right than wrong. That there really is a conspiracy here. A cover-up.”
“Then the people deserve to know!”
“They do,” I confess. “And they will, eventually– but not from you, and not from my report. Neither is an option.”
She shakes her head, incredulous. “Then how?”
I walk to the window, rest my hands against the edge. I take a breath. It’s humid, heavy with South American heat. “I’ll figure something out. I always do.”
There’s a heartbeat of silence. Then, she asks the obvious question. “It’s your employer, isn’t it? This whole thing has something to do with The Facility.”
“Yes,” I tell her. “I think it does.”
She appears at my side. The two of us stare out across the dark of the base, out at the steel hatch rising from the dirt, where a devil made flesh is inching ever closer. “I thought you said your job was hunting monsters,” she says at length, “not creating them.”
“My job is a lot of things. More than anything else, it’s complicated. The Facility is… Well, it’s not what I’d call a good organization. Or even a moral one.”
“Then what is it?”
I consider the question. “A pragmatic answer to an otherwise ugly question.”
She looks at me expectantly.
“The question of salvation,” I explain. “The question of how do you rescue humanity from a nightmare so twisted that it defies all language? All concept of imagination? There’s something coming for us, Maria, something dark and unfathomable, and these entities– these monsters might be our only chance at fighting back.”
She’s quiet. Her expression is difficult to read.
“Decades ago, The Facility was a very different organization,” I tell her. “In those days, they thought the approaching nightmare was right around the corner, that we had weeks or months until it showed up on our doorstep. They didn’t know. Out of fear, they greenlit any and every possible solution. Or at least, that’s what the rumors say.”
“Rumors?”
I nod, darkly. “There are no real records of The Facility’s activities during the Cold War. Most documents were destroyed. The few that remain are heavily redacted. I wasn’t around then, obviously, but I picked up bits and pieces from old-timers I’ve worked with. They mentioned black projects. Hidden programs. One project was particularly infamous, so much so that even now, half a century later, The Facility hasn’t entirely snuffed out its legend.”
“What project?”
“Project Judas,” I say. “If you believe the rumors, it was headed by a brilliant biochemist named Screech. Jonathan Screech. The aim of the program was to create the ultimate weapon, a monster that could assimilate targets into its being, absorbing their capabilities. Such a function would provide it with a near limitless power ceiling. The problem was–”
Something hits my ears. Maria’s hand finds my arm, squeezing it painfully.
“Do you hear that?” she hisses.
Steel rattles in the distance. There’s a low groan of warping metal, like the rungs of a ladder slumping beneath the weight of something titanic. There’s something beneath us. It’s inside of that bunker, climbing that old ladder, and it’s making its way to the surface.
“We’ve gotta run!” Maria tugs at my arm, but I keep my feet planted where they are. My eyes narrow. I stare at the now trembling steel wheel, lit up beneath the light of the jungle moon.
Maria stumbles backward. A smile finds its way onto my face. In the distance, across the ruins of the base, the bunker’s hatch is thrown open. A dark shape emerges. It buzzes like television static, framed in shafts of moonlight. Its twin eyes glow like cinders. The shadow lurches, looking around, scanning the base and emitting a low electric hum.
“That’s it…” Maria whimpers. “Oh God… that’s it…”
The creature sees us. It sees me. It takes a shambling step forward, and dust and dirt fly into the air beneath its weight. Its eyes smolder, growing and growing until they become a blaze of fire. Maria is on the ground. She’s hiding beneath the window sill, reefing on the fabric of my pants and pleading with me to run, but I hardly notice she’s there.
This shadow– this monster, is why I’ve come here tonight.
Now, we finish things.
A wave of arctic air passes through my mind. It’s just as she described. My heart slams as I feel this Shadow rifle through my thoughts, chewing on my memories. I close my eyes. I breathe deep, inviting it in. Go ahead. Have your fill.
And then with one final shiver, the cold in my skull fades. That Shadow retreats, pulls back from my mind and when I open my eyes, I see it gazing back at me. The fire in its eyes dims to that cinderglow. It tilts its head skyward. Six black wings burst from its back in a shower of static.
“What’s happening?” Maria asks frantically, still on the ground beneath the window. “How are you going to kill it?”
“I’m not,” I tell her.
The Shadow belts out one last distorted howl before launching itself into the air like a streak of night. Three flaps of its wings, and it’s gone. Vanished into the sky, lost amongst the clouds.
Maria rises to her feet. Her eyes are wide. She’s shaking, her entire body is shaking with a tidal wave of horror. “Oh no…” she mutters, gazing at the sky. “It’s gone… So many are going to die…”
“Yes,” I tell her. “I hope so.”
She turns to me then, angry. Stunned. “You told me your job was stopping those things! Hunting them! What’s the deal, asshole? Why’d you just let it fly off?”
“Because I never finished my story.”
“You’ve gotta be kidding me…”
“Project Judas had a directive,” I explain. “A very specific one. Its purpose was to assimilate hostile entities, to annihilate monsters and boogeymen, and ensure the survival of our species. Simply put, it was never made to hurt humans. After everything you’ve told me, I’m not convinced it can.”
She crosses her arms, looking at me like I’ve lost my mind. “Were you even listening to what I said? I found a fucking graveyard down there. It burned John’s skull to a crisp, cracked it open, and ate his brains. I don’t care what it was designed for– I watched it kill a human right in front of me.”
“I’m not certain you did.” I lift up my briefcase, paying my now ashen clipboard one final, farewell glance. “From everything you described, I question whether John was a man at all by the time he took you down to that bunker. If he really was Jonathan Screech, and I think the evidence points to yes, then it’s said he conducted more than a few experiments on himself along the way. The glowing eyes? I’ve never met a human with a set of those.”
“But–”
“Fact is, John brought you here to kill you. John told you that he needed to feed you to his child, that he didn’t have a choice…” My thoughts turn to all the strange disappearances that led me here. The missing entities. The absentee urban legends. “He was feeding Judas a steady supply of horrors, just enough to keep it from entering hibernation– right up until the moment he ran out. That’s why he pulled you down there. He thought you’d be an easy mark, that maybe with a little creative twisting of the narrative, he could convince Judas that you were close enough to food.
"Remember how he kept calling you a monster? Unfortunately for John, he misunderstood his own creation. Project Judas wasn’t designed to harm human beings. It went against its core directive. So in that moment, when John offered you as a sacrifice, a flip switched in Judas that made it realize John had crossed the threshold and become a monster himself.”
She’s quiet as we walk out the door. “You think he really was that Jonathan Screech guy?”
I shrug. “Maybe. I doubt there are dental records to double check, but based on what you’ve said tonight, it wouldn’t surprise me if Screech couldn’t let his project die. A creature like Judas… The Facility probably didn’t have a means of terminating it, so they buried it instead. Sealed it behind blast doors a kilometer beneath the earth. Then they erased all records of this base ever existing.” My SUV is gleaming black, impossible to miss against the ruinous backdrop of ancient humvees. I crack the passenger door. “Need a ride?”
She smiles. It’s the first time I’ve seen her smile all night, and I can’t help but smile back. “Thank you,” she says. “For not killing me.”
“Don’t mention it.”
She clambers into the seat, and just as I’m about to close the door, she stops me. “Wait,” she says quickly. “I forgot earlier, but John mentioned another entrance. One used for freight… That’s probably how he got back into the bunker after they sealed it up. He seemed to know everything about that place.”
“Yeah,” I tell her. “I figure he must have.”
I close the door and circle to the driver's side.
“So what do we do now?” she asks as I hop in. “About that thing, Project Judas?”
"Nothing," I say, plugging the key into the ignition and giving it a twist. The engine rumbles to life. “As far as I’m concerned, that creature isn’t a monster. And that means it’s not my problem.”
The vehicle rattles as we pull out of the base and onto the jungle road. Maria twists in her seat. She looks back through the rear window as her worst memory falls further and further behind us. “If it isn’t a monster, then what is it?” she asks.
Words drift around my head. Definitions. I’m trying to figure out how to explain what it is that she and I saw, what it is that more people will see in the coming weeks. I’m trying to think of a way to tell Maria that whatever that thing was, she doesn’t need to be afraid of it. None of us do.
I open my mouth to reply, but I’m interrupted by a microphone howl. It’s distant. Far away. I crane my head and look up through the scatter of vines passing above us. And then I see it. A dark speck on the horizon. It’s little more than a dot against the moonstreaked clouds, but I know that if it were closer, I’d see a creature with six wings. I’d see a shadow with cinderlight eyes. A body of black static.
I’d see a guardian angel– one with plenty of work to do.
submitted by Born-Beach to TheCrypticCompendium [link] [comments]


2024.04.28 23:18 ConsciousConcern901 No amount of knowledge will trump experience

As in, the part of you that holds all these limiting beliefs that feel difficult to overcome, the part you feel like you’re in a constant struggle with. The “Shadow” self.
That version of you doesn’t care about the facts of the law. It’s why you can repeat that circumstances don’t matter, feel good for a minute, but later start to fall back into the old swing of things. It cares about experience.
You’re fighting the battle with the wrong weapon.
We may be aware of limiting beliefs, but we tend to internalize them after an experience and stamp them into our identity. Because experiences tend to become memories, and memories are tied to specific emotions that trigger immediately in our bodies when we think of them.
Using the logic of the law and acquiring knowledge is only half of the work. Your “shadow” self fears reliving the experience that caused you to have the negative limiting beliefs. It has knowledge of the law, yes, but it also has knowledge of the 3D, and the experience of the 3D to counter it.
The brain can’t tell the difference between memory and imagination. It can’t tell if an immersive imagined experience is real or not. The emotions and body react all the same.
Therefore, your “shadow” self, the limiting-beliefs you, doesn’t actually care about the facts of the 3D or knowledge of the law; what it cares about is the experience.
If you feel unworthy of love, reading “worthiness doesn’t matter” won’t do much if you have the experience and feeling of unworthiness in your state of being. You need to go into the 4D and create the experience of being worthy, feeling that you are worthy of love, to craft the memory. Your brain will believe it regardless.
This goes for anything.
People will go into their imagination and imagine their desire in a way that almost subconsciously allows them to view the desire while avoiding confrontation with the limiting belief.
You can imagine yourself and your SP being together and eating at a restaurant. But how does that resolve you feeling unworthy? It only addresses the “having,” but for as long as you feel unworthy, you won’t actually have it.
(Also to note, people have things all the time but their feelings get in the way. People date someone yet still feel unworthy. And the relationship suffers because of that. This goes for anything.)
Or maybe you want to be a famous athlete. But you’ve been bullied about your skills. You know you’re better than your peers, but feel like an imposter amongst the professionals.
No amount of “skills are irrelevant, circumstances don’t matter” will convince your “shadow” if you don’t have any experience to back it up.
Same thing: some people may imagine themselves as the athlete doing interviews, or anything not related to skill set, as an unconscious way to avoid addressing the limiting-belief part of them. Your desires this way will always feel so close yet so far, because you’re just touching the surface.
There is no difference between a memory and imagination. So you can just create new memories.
Carl Jung said “Until you make the unconscious conscious, it will direct your life and you will call it fate.”
This doesn’t apply to everyone, but if you’re struggling, maybe it applies to you. Yes, some people didn’t need to do this, but that doesn’t really matter, does it?
Give yourself the experience that you would need to recover from the limiting belief. As Neville points out, the law of reversibility is ingrained in the law. If a belief gets solidified from experience, then creating an experience will solidify your new belief that you desire to obtain.
submitted by ConsciousConcern901 to NevilleGoddard2 [link] [comments]


2024.04.28 23:18 xBotvernor [xGov-195] Frostbits Solutions Arcpay - Frostbits Solutions

title Frostbits Solutions Arcpay
id 195
period 4
author Wilder Stubbs (@WilderStubbs)
email wilder@frostbits.solutions
discussions-to https://forum.algorand.org/t/xgov-195-arcpay/11843
company_name Frostbits Solutions
category dApps
focus_area Banking
open_source Yes
funding_type Proactive
amount_requested 149000
usd_equivalent $29,569.05 (note: automated conversion)
delivery_date 2024-08-31
status Final

Abstract

Arcpay is a plug-and-play, open-source payment processing platform for transacting easily on the Algorand blockchain. Arcpay empowers businesses to accept Algorand tokens as payment and to build advanced workflows for use cases such as community engagement through loyalty-point redemption, subscription payments, and e-stores.
Arcpay is meant to be a gateway to the Algorand ecosystem that provides a streamlined path for traditional web-based projects and removes barriers to entry through an easy-to-use web interface and SDK. Arcpay will support network tokens and all Algorand Standard Assets (ASAs).
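To make the integration model concrete, below is a minimal sketch of what a merchant-side integration could look like. It is illustrative only: the "arcpay" package name, the ArcpayClient class, and the createListing method are assumptions made for this example, not a published Arcpay API, and the asset ID and price are placeholders.

```typescript
// Illustrative sketch only: package name, client class, and method shape are
// assumptions about the planned SDK, not an existing Arcpay API.
import { ArcpayClient } from "arcpay"; // assumed package name

const arcpay = new ArcpayClient({
  apiKey: "YOUR_API_KEY",   // placeholder credential, assumed auth scheme
  network: "testnet",       // assumed option: "testnet" | "mainnet"
});

async function sellTShirt(): Promise<void> {
  // A traditional web store lists an item priced in an ASA and gets back a
  // hosted checkout link, so it never has to build wallet or chain logic itself.
  const listing = await arcpay.createListing({
    name: "T-shirt",
    assetId: 123456,      // placeholder ASA ID the merchant accepts as payment
    price: 25_000_000,    // amount in the asset's base units
  });
  console.log("Share this checkout link with the buyer:", listing.checkoutUrl);
}

sellTShirt().catch(console.error);
```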

Team

Our team is composed of seasoned professionals with extensive experience in blockchain technology. This background provides a robust foundation for addressing the unique challenges of integrating Web2 with Algorand Virtual Machine (AVM) technologies, ensuring Arcpay's seamless functionality across diverse digital environments.
Expertise and Roles:

Experience with Algorand

The Frostbits Solutions team has been contributing to the Algorand ecosystem since 2021, most notably with Algogems, an Algorand platform known for its pioneering work in NFT creation and trading. We have a strong technical understanding of AVM technologies!
Algogems NFT Marketplace

Present Proposal

Problem Description

Integration of Algorand technology into existing applications presents significant technical and operational challenges, hindering broader adoption and confining blockchain applications largely to niche markets. Common barriers include the need for specialized knowledge, security concerns, and integration complexities.
Aligned with Algorand's vision of global decentralization, scalability, and security, Arcpay introduces a seamless gateway to the ecosystem and provides a streamlined path for businesses to transact on the blockchain. Use cases include B2B and B2C payments, loyalty point management and redemption, and removing intermediaries for improved operating margins. This will make Algorand an appealing and cost-effective choice for businesses entering the competitive Web3 ecosystem.

Solution Approach

Arcpay will bridge traditional applications and blockchain by providing:

Deliverable

Technical Approach

Arcpay is built on principles of flexibility and user-friendliness, with a commitment to open-source development and community collaboration. Security strategies include community-driven vulnerability assessments and periodic expert reviews if financially viable, ensuring a reliable and secure platform. The Algorand native solution will build upon and refine the Arcpay platform created for Voi Phase1 testnet.

Concerns & Risks

Technical Challenges:
- Incremental Deployment: Roll out features in stages to manage complexity and allow for focused testing and optimization at each step.
- Community Beta Testing: Prior to official release, conduct extensive beta testing with community members to ensure real-world applicability and robustness.

Bugs and Broken Features:
- Ongoing Maintenance Post-Launch: Continue to address and rectify any issues that arise post-launch on the mainnet, ensuring the platform remains reliable and secure.

Adoption Rates:
- Partnerships: Form strategic partnerships with key players within the Algorand ecosystem to leverage their networks and gain credibility.

Hosting Sustainability:
- Revenue Model Implementation: Develop and implement a clear revenue model to support long-term hosting and operational costs.
- Performance Monitoring: Regularly review performance metrics to ensure that hosting remains financially viable and adjust strategies as needed.

Documentation and Community Support:
- Documentation: Develop comprehensive, easy-to-understand user and developer documentation.
- Community Channels: Establish and maintain active community support channels, including forums and live chat support through Telegram, X, and Discord.
- Feedback Loops: Implement structured feedback loops to continuously improve documentation and support based on user input.

Ongoing Maintenance:
- A 4-week review and validation phase per milestone launch, followed by ongoing maintenance, ensures platform reliability.

Future Blueprint

Project Longevity

Designed for sustained growth, Arcpay will evolve with blockchain advancements and community feedback. Supported on both the Algorand testnet and mainnet, our platform aims for long-term operational sustainability through a clear revenue model. Our commitment extends to maintaining the platform for at least one year after final delivery, with intentions to continue as long as it remains economically feasible.

Project Timeline

Throughout the project, we will actively gather and incorporate feedback via the Algorand Forums, social media, and messaging platforms, ensuring that Arcpay not only meets but exceeds community and user expectations.

Milestones

Milestone 1 - MVP (ETA: 4 weeks)
- ASA to ASA sale contract
- End-to-end UI to create sale and interact with contract (Sign In, User dashboard to create, edit, and list NFTs and payment modal)
- Link to assets
- Secondary market sales listings and settings
- Integration through direct links
- Development kit to interact with the listings
- Documentation

Milestone 2 (ETA: 6 weeks)
- ALGO to ASA sale contract
- ASA to ASA English Auction contract
- ASA to ASA Dutch Auction contract
- ALGO to ASA English Auction contract
- ALGO to ASA Dutch Auction contract
- Secondary market auctions listings
- Updated UI for Auctions
- Updated Development kit for Auctions
- Updated documentation

Milestone 3 - Final Delivery (ETA: 8 weeks)
- ALGO to RWAs sale contract
- ASA to RWAs sale contract
- User dashboard with statistics and settings
- API Subscription page
- Updated UI for RWAs sale
- Updated Development kit for RWAs sale
- Network switch
- Updated documentation
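For context on what the Milestone 1 ASA to ASA sale contract and development kit abstract away, the sketch below shows the underlying idea of an atomic exchange between two Algorand Standard Assets using the JavaScript algosdk (assuming the v2.x API). The node URL, addresses, asset IDs, and amounts are placeholders, and a real sale contract would add escrow, pricing, and validation logic on top of this.

```typescript
import algosdk from "algosdk";

// Public Algonode testnet endpoint, used here only as an example.
const algod = new algosdk.Algodv2("", "https://testnet-api.algonode.cloud", "");

// Build the two legs of an ASA-to-ASA sale as one atomic group:
// either both transfers confirm, or neither does.
async function buildAsaSaleGroup(
  buyer: string,        // placeholder buyer address
  seller: string,       // placeholder seller address
  paymentAsaId: number, // ASA the buyer pays with
  goodsAsaId: number,   // ASA the buyer receives
  paymentAmount: number,
  goodsAmount: number
): Promise<algosdk.Transaction[]> {
  const suggestedParams = await algod.getTransactionParams().do();

  // Leg 1: buyer pays the seller in the payment ASA.
  const payTxn = algosdk.makeAssetTransferTxnWithSuggestedParamsFromObject({
    from: buyer,
    to: seller,
    assetIndex: paymentAsaId,
    amount: paymentAmount,
    suggestedParams,
  });

  // Leg 2: seller delivers the goods ASA to the buyer.
  const deliverTxn = algosdk.makeAssetTransferTxnWithSuggestedParamsFromObject({
    from: seller,
    to: buyer,
    assetIndex: goodsAsaId,
    amount: goodsAmount,
    suggestedParams,
  });

  // Grouping ties the two transfers together; each party then signs its own leg.
  algosdk.assignGroupID([payTxn, deliverTxn]);
  return [payTxn, deliverTxn];
}
```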

Benefits for the community

Expected Impact & Outcomes for the Algorand Community

Define Success

Success for Arcpay will be defined by the adoption rate among businesses and P2P users, the volume of transactions processed through the smart contracts, and positive community feedback regarding its impact on the ease of integration.
We envision Arcpay’s success playing out in two sequential stages:
  1. Adoption by top Algorand players such as top NFT projects, Wallets, and existing dapps.
  2. Expansion to traditional businesses and non-Algorand native Web3 projects.

Additional information

Contact Information

This proposal is supported by Frostbits Solutions, a Canadian company specializing in custom, cutting-edge software that leverages decentralized networks to facilitate rapid, secure, and cost-effective exchanges of digital goods.
Website: frostbits.solutions
Email: wilder@frostbits.solutions
Twitter: @WilderStubbs
Discord: Wilder

Github Links

Proposal on Github
Conversation and changes can be found in the top comments.
submitted by xBotvernor to xGov [link] [comments]


2024.04.28 23:17 Wednesdaynevermore My classmate claimed she could shape-shift. 20 years later and I still believe her.

I’ve spent 2/3 of my life trying to debunk this one or come up with a reasonable explanation.
When I was in 5th grade I rode the bus with a girl who was a grade or two below me. We lived on opposite sides of a rather large apartment complex but could still walk to each other’s homes. Let’s call her Mika. Mika was born in Romania and had moved here a few years prior to me meeting her. She had heterochromia. Mika had brown eyes, but half of one of her irises was blue. This is relevant later. Her family was…off. I went over to her apartment a few times to hang out with her and remember seeing a lot of unusual decor in her home. Lots of candles, a few altars, and a lot of taxidermy. Her mother would get very upset anytime she brought friends over, so typically she would sneak me and a few of our other friends in anytime her mom was out shopping. My mom was pregnant at the time and often felt sick, so I never brought my friends over unless she was out shopping or at the doctor’s. She had never met Mika.
A lot of the other kids were scared of her or felt uneasy around her; none of our mutual friends could explain why. I did too at times, for some unknown reason, but as fellow outcasts we stuck by each other’s sides.
After a few months of knowing each other she started telling me that her family followed a belief system similar to paganism. I forget exactly what she called their religion, but it wasn’t Wicca/Pagan or any other “common” (for lack of a better word) spiritual belief system. Mika started claiming she had special powers. I immediately called bull and told her to prove it. She responded by saying that she would shape-shift into a cat and visit me later that evening.
My family and I lived on the third floor of our apartment building. There were no fire escapes, balconies, or anything like that. We barely even had window ledges.
As a side note, I loved cute stuff. Hello kitty, Care Bears, My little pony, all that good stuff. My “best friend” at the time was very into fashion and loved to act like a high schooler. The few times she saw me in cartoon PJs she’d make jokes. So, I stopped wearing anything like that to school or sleepovers almost a full year prior to me meeting Mika.
I went to bed at my normal bedtime of 9:30 wearing some Care Bear pajamas that my mom had bought me a week prior. Nobody had seen me in them aside from my parents. Around 3am that night, I woke up to hear scratching at my window. I jumped out of bed, looked out my window, and saw a black cat staring directly at me. The cat was sitting on the very thin window ledge and raised its paw at me as if it was waving at me. Freaked out, I told myself it was a dream and went back to sleep.
That morning while my mom was cooking breakfast, she asked if I had heard anything weird outside the night before. I asked what she meant.
“I saw the weirdest thing last night. I heard scratching at my window. I got up to see what it was and saw a black cat on the window ledge! It looked like it was trying to open my window!”
I sat there in silence unsure of what to say. My mom continued.
“I was afraid it was going to fall, so I brought it inside, carried it downstairs and put it outside. You should have seen its eyes! It had one eye that was half blue. I’ve never seen a cat like that before…”
I felt so sick that I couldn’t speak or finish my breakfast. Not wanting to approach Mika after all this, I stood off to the side away from all the other kids until she walked up to me.
“Hey Gloomy Bear, why didn’t you let me in last night?”
I was stunned.
“I could have died! At least your mom was nice enough to take me downstairs.”
I stopped hanging around her as much after that. Our building caught on fire a few weeks after all this happened and we ended up moving a few towns away. I haven’t seen or heard from her since. To this day my mom still remembers the cat. She’ll mention it in passing every few years and make comments about how weird it was that a cat could have gotten up on that ledge.
For the last 20 years I’ve been racking my brain looking for an explanation. If anyone has one, I’d love to hear it.
submitted by Wednesdaynevermore to Thetruthishere [link] [comments]


2024.04.28 23:14 AckAttack6710 Recently started Adderall. Brain was finally quiet for the first time in my life. But that feeling never happened again?

I was diagnosed as an adult, and in late Feb I got prescribed 5mg Adderall IR to take "as needed" from my GP. She said take one, if it feels like nothing changed in 45 minutes, take another. First couple days, one did nothing at all, so I double checked and she said I was okay to try two at a time, so I did.

That first weekend was magic. All the background noise stopped and if I wanted to do something, I could just... do it. I got a new prescription for 10mg IRs "as needed" once the 5s ran out, and 20mg XRs to see if I liked them. The XRs are weird. I feel different, but not in a good way necessarily, I think? Not the same as the IRs for sure. But with both medicines, I still don't have that quiet feeling I got in the first week. (side note - is medicine really all trial and error? Is there no other way?)

Is my quiet brain the "euphoric high" everyone talks about? Is that not a realistic goal, and should I just accept what I have? Symptoms are okay on the 10mg IRs, but my head isn't quiet, which was really, really nice.

I spoke with my GP and my psychiatrist, but neither seem to fully grasp what I'm asking about. I was hoping you wonderful people would and could offer some insight, potentially.
submitted by AckAttack6710 to ADHD [link] [comments]


2024.04.28 23:10 Daurakin Overclock suggestions - Part 2: Engineer

----- ENGINEER OVERCLOCKS -----
-- "Warthog" Auto 210 --
Warthog OC: Pump Action
Positives:
  • 2x damage and +1 pellets.
  • 0,75x spread.
  • +1 penetration.
Negatives:
  • Requires a pump-animation after each shot (same animation as the second half of the reload-animation) which reduces rate of fire a lot.
  • -2 magsize.
  • 2x recoil.
  • 0,622x ammo

Warthog OC: Hellboar Shells
Positives:
  • Adds 200% heat power, allowing it to ignite enemies (no conversion, just addition!).
  • +3 penetration.
  • Each pellet has a bigger hitbox.
Negatives:
  • -1 damage.
  • +20% spread.
  • No longer hitscan, instead having fiery projectiles with (still pretty fast) travel time. Furthermore, the projectiles dissipate after 30 meters, limiting the attack range.

Warthog OC: Plasma Coated Pellets
(Credit to u/uwuGod for this OC idea!)
Positives:
  • Pellets can now ricochet off enemies and surfaces alike (not with homing properties, mind you). Can bounce 2 times per pellet.
  • 1,5x pellets.
  • +400% armorbreak bonus.
  • Can now stun on non-weakpoint hits as well.
  • Each pellet has a bigger hitbox.
Negatives:
  • -1 damage.
  • 0,833x ammo.
  • Pellets can no longer hit weakpoints.
  • -1 magsize
  • Pellets now have travel time, instead of being hitscan projectiles.
-------------------------------------------------------------------------------------------------------------------------------------------
-- "Stubby" Voltaic SMG --
Stubby OC: Nanobot Explosives
Positives:
  • Bullets explode with a 3-meter-radius AoE explosion upon reload or 4 seconds after impact, including shots placed on terrain. Damage of the explosion is equal to the modded bullet damage, but as explosive damage. Explosions can electrocute everything in their area of effect, with each explosion having the same chance to electrocute as direct hits have.
  • 0,67x reloadtime.
Negatives:
  • Direct hits no longer deal any kinetic damage (but can still electrocute!)
  • 0,5x magsize.
  • 0,75x ammo

Stubby OC: Laserblast EM Tuning
Positives:
  • Positive: 3x damage.
  • 0,5x spread.
  • Its shots slow down struck enemies by 25% for 2 seconds.
  • Alteration: Its electrocution effect is converted to enemy heating. 1% electrocution chance becomes 1 heat power per shot (So the tier 1 Upgraded Capacitors mod doubles the heat power per shot from 25 to 50).
  • Alteration: The tier 5 Electric Arc mod is changed in behaviour; on enemy hit, 25% of the heat power is also applied in an area around the struck target, with a 2,75 meter radius.
Negatives:
  • Semiauto trigger
  • 0,67x rate of fire.
  • 1,25x vertical recoil.
  • 0,4x magsize.
  • 0,4x ammo.
-------------------------------------------------------------------------------------------------------------------------------------------
-- LOK-1 Smart Rifle --
LOK-1 OC: Killer Gaze
(Credit to u/uwuGod for this OC idea!)
Positives:
  • Alteration: The lock-on laser deals damage when it locks on to an enemy, dealing Disintegrate + Fire damage continuously (equal to original bullet damage). More locks = More damage and ammo drain, in a proportional manner.
  • Positive: 1,5x ammo.
Negatives:
  • Can no longer shoot bullets on trigger release, which also means no capability to hit weakpoints, nor can you hit environmental things such as Gunk Seeds etc.
  • 0,67x lock-on width.
  • 0,8x lock-on range.
Notes:
  • Some mods would work slightly differently, for example:
  • T3A (Electro-Chemical Rounds) - This would update and apply its damage bonus(es) for each damagetick on currently locked on targets, as soon the target(s) becomes ignited/electrocuted.
  • T3B (SMRT Targeting Software) - This would instead constantly update and change its current locks to other targets as soon as an enemy with locks on them dies, removing the need to release-and-hold the trigger to acquire locks on new incoming targets.
  • T3C (Super Blowthrough Rounds) - This causes each visual lock-on beam to also hurt 1 other enemy that might be touching it (!)
  • T5A (Electric Generator Mod) - This would apply the electrocution immediately on a target when it gets 3+ locks on it.
  • T5B (Unstable Lock Mechanism) - This would update and apply its damage bonus for each damagetick on currently locked on targets - meaning, as soon you reach the full amount of lockons, all currently attacked enemies will suffer more damage per tick.
  • T5C (Fear Frequency) - This still triggers its fear effect upon releasing the trigger, but its fear effect is calculated from the highest amount of locks that was acquired at any time during the time the trigger was held.

LOK-1 OC: Whisperwind Burst
Positives:
  • Bullets will now bounce among all locked-on targets.
  • Instantly puts 3 locks on each target it locks on to.
Negatives:
  • Cannot put more or fewer than 3 locks on each separate enemy it locks on to.
  • 1,5x lock-on speed. Note that even if this downside slows down the overall lock-on process, this OC is still quite fast to lock on to targets due to always instantly applying 3 locks on each target (as described in the positive points).
  • Will always shoot exactly 3 bullets after locking on to targets - nothing more, nothing less - regardless of how many targets are locked on to. It will also always shoot all 3 bullets, even if the target(s) dies from just 1 bullet each. (Note: While this is a "downside", it can technically also be seen as more of an upside, due to the 3 bullets bouncing among all the different targets, reducing potential bullet-waste to some degree).
  • 0,5x ammo.
  • 0,5x magsize.
-------------------------------------------------------------------------------------------------------------------------------------------
-- Deepcore 40mm PGL --
PGL OC: Artillery Rounds
Positives:
  • The further the grenade travels, the bigger its explosive radius and the stronger the explosion becomes. Max bonus: +150% radius, +150% damage.
  • When a grenade is in flight, a gauge-indicator will pop up next to the crosshair, showing how much of a bonus it currently has gained.
  • Press reload while a grenade is in flight to send it straight down to the ground with highly increased speed.
  • When a grenade is in flight, an indicator will be shown on the ground, pinpointing where the grenade would land if you decide to send it down with the reload-button.
Negatives:
  • Base radius is reduced by 0,8 meters.
  • Base damage is reduced by 30.

PGL OC: Thunderbird Bombs
Positives:
  • Explosions also electrocute enemies.
  • 1,5x projectile speed.
Negatives:
  • -25 explosion damage.
  • +0,5 second reloadtime.
  • 0,5x armorbreak.
  • No longer fears enemies.
-------------------------------------------------------------------------------------------------------------------------------------------
-- Breach Cutter --
Breach Cutter OC: Compressed Ion Blades
Positives:
  • +24 ammo.
  • +3 magsize.
  • +1 rate of fire.
  • -0,6 second reload time.
Negatives:
  • 1,33x projectile speed (this is a negative for DPS, but positive for max distance and accuracy).
  • 0,5x beam width.
  • 0,67x contact DPS (Note, the 50 fire damage dealt on first contact is not reduced)
-------------------------------------------------------------------------------------------------------------------------------------------
-- Shard Diffractor --
Shard Diffractor OC: Pulse Force
Positives:
  • The reload-button winds up a new attack, a "Force Shot". The Force Shot requires a brief windup before shooting (same as the regular attack), or can be fired immediately if already holding down the primary fire button. When shot, a conical wave of energy is blasted in front of you (7 meter forward reach). This shot consumes 25 ammo, but deals 16x the weapon's normal area damage in the entire cone, along with stunning all targets struck for 4 seconds. After firing, it requires the gun to recharge its magazine (same as if letting go of the trigger with the regular firing mode).
  • Regular firing mode's damage is increased by 6x (both direct and area damage).
  • Regular firing mode's heat generated on enemies is also 6x as much.
Negatives:
  • The regular firing mode's beam is no longer a continuous one. Instead, it "pulsates" in a slower manner (see rate of fire below).
  • 0,2x rate of fire.
  • Regular firing mode costs 5 ammo per shot.
  • +0,2 second windup time (this affects both the regular firing mode, and the added "Force Shot"-attack from the reload-button).

Shard Diffractor OC: Riot Beam
Positives:
  • +0,6 meter radius for the area damage.
  • Wind-up time removed. Meaning, you will fire as soon as you press the trigger (as long as the magazine is charged to full, of course).
submitted by Daurakin to DeepRockGalactic [link] [comments]


2024.04.28 23:08 MedicinalPsycho a story idea that's been bouncing around my mind.

This is mainly to see if my idea has any merit. I've been sitting in front of white pages too long and don't know if it's worth really getting into writing. These are my most finished parts, because my ADHD has bounced back to them the most; if you want more, I can see about dropping what bits and pieces I have in the discussion.
chapter 1: As Elise and Atlas navigated the dense forest, a sudden tearing noise sliced through the air, chilling them to the bone. Startled, Atlas grasped Elise's shoulder, his expression filled with curiosity. He suggested investigating the source of the disturbance. Pushing through the underbrush, they stumbled upon a peculiar tear in the fabric of space, revealing a group of otherworldly creatures.
One of the creatures emerged fully, its gaze fixating on the two children. With dread sinking in, Atlas acted swiftly, shoving Elise out of harm's way as the creature lunged towards them. The impact sent Atlas reeling, and he found himself being dragged towards the tear. Frozen in fear, Elise watched helplessly until a surge of determination coursed through her.
Ignoring her trembling hands, she seized her basket of berries, smashing it over the creature's back. In response, the creature turned its attention to her, striking her with a forceful blow. As Atlas regained his footing, he armed himself with a sickle, positioning himself defensively between Elise and the creature.
Their desperate struggle continued, with Atlas narrowly avoiding the creature's attacks. However, a moment of distraction proved costly as the creature struck Atlas, sending him crashing into a tree. Elise's cries pierced the chaos as she witnessed her brother's fall, her heart pounding with terror.
The creature, undeterred, focused its aggression on Elise, inflicting a devastating blow that fractured her leg. With a firm grip on her, it began dragging her towards the rift, leaving Atlas helpless to intervene. In his final moments of consciousness, Atlas glimpsed a glimmer of hope as an arrow pierced the creature's neck, and darkness descended upon him.
Saved by the hunters from their village, Elise and Atlas survived the harrowing encounter, the echoes of the tearing sound fading into the night, leaving behind a haunting reminder of the dangers lurking in the shadows.
chapter ?: As the fight against the mages raged on, Atlas could feel his body weakening. He had already taken down three of the giant ogre-like mages, but he could feel his magical reserves dwindling. Meanwhile, Elise was holding her own against her three opponents, but the mages were proving to be formidable adversaries.
Suddenly, one of the mages broke away from the fight and flew towards the party members, preparing a powerful magic artifact. Atlas knew that he had to act quickly, but he was blocked by the second mage, who used wind magic to throw him back towards Elise. But Elise was quick on her feet and used her own wind magic to redirect Atlas into the blind spot of the third mage.
Atlas took advantage of the situation and decapitated the third mage with his sword. But before he could make his way to the mage with the artifact, it had already activated. Atlas knew that he couldn't stop the attack, so he threw himself between the artifact and his party members.
As the attack hit, Atlas felt his body disintegrating. But he could hear a voice in the back of his head, urging him to get out of there. It was the voice of his sister, who had always been protective of him. But Atlas ignored her plea and instead imagined all of his strength flowing into her.
As he closed his eyes, he saw a dim golden aura surrounding him. The aura mixed with the red and black light from the attack, and Atlas's spirit absorbed the massive spell. He defied the laws of the universe and directed all of the power into one thought: "Elise better make it out of this alive."
In that moment, Atlas sacrificed himself for the sake of his party members and the world. But his spirit lived on, infused with all the love and compassion he had for his sister.
chapter ?: Elise walked slowly towards the pile of dust where her brother once stood. She kneeled down and reached out, picking up a small metal plate with an inscription from Atlas. "I saw death yesterday on my watch, I know I won't make it out of this fight," Elise read out loud, her voice breaking slightly. "I know you will hurt for a long time, but I don't want you to get lost. Become the pillar for the future, find a husband, and please carry on. I love you - Atlas."
Elise closed her eyes and held the metal plate close to her chest. Tears didn't come, but her heart felt shattered. She stood up in a daze and began walking with a blank expression, passing through the many battlefields that had taken the lives of her party members.
As she walked, a small gremlin monster suddenly jumped out and stabbed her in the shoulder, the point of the dagger poking out the front, its tip glistening red. Elise turned and slashed the creature's throat, still with the dagger lodged in her shoulder.
For three days and nights, Elise walked without stopping until she finally arrived at the capital, dagger still in her shoulder. The guards at the gate tried to stop her, but they froze up when they met her eyes and let her pass. Elise then made her way to the palace where the royal guards tried to stop her but also let her pass when they realized who she was.
Elise finally stood before the king and explained what had happened in the fight and how the others had fought honorably. Only after her audience with the king and when she returned to her and Atlas's room did she finally let her emotions out, falling to her knees and crying uncontrollably.
Days passed, and Elise didn't leave her room. She remained isolated, feeling the pain of losing her brother and the others. However, one day, she decided to leave the room and go for a walk around the city. As she walked, she heard someone calling her name.
It's still a work in progress; my ADHD mind jumps from scene to scene without properly finishing any. The story focuses on Elise and Atlas as their world changes. The "main" world is like our medieval age, knights, kings, and farmers, and there's another world with the slight change that it has magic. The creatures from that world studied magic instead of science and are now having issues with resources and land; instead of looking up to the stars and going to other planets, they use magic to find another Earth and are trying to invade it.
All that gets explained in 3 or so half-finished chapters, but the main story is gonna be Elise and Atlas learning to make use of the limited magic that seeps into their world when the rifts open and trying to stop the creatures from destroying their world, creating adventurer guilds and teaching magic/how to fight these creatures. In a fight to defend the capital, Atlas intercepts a massive spell and, through a way I have yet to properly explain, grants his sister immortality.
submitted by MedicinalPsycho to fantasywriters [link] [comments]


2024.04.28 23:02 MedicinalPsycho [f] a story that's been bouncing around my head for a while

chapter 1: As Elise and Atlas navigated the dense forest, a sudden tearing noise sliced through the air, chilling them to the bone. Startled, Atlas grasped Elise's shoulder, his expression filled with curiosity. He suggested investigating the source of the disturbance. Pushing through the underbrush, they stumbled upon a peculiar tear in the fabric of space, revealing a group of otherworldly creatures.
One of the creatures emerged fully, its gaze fixating on the two children. With dread sinking in, Atlas acted swiftly, shoving Elise out of harm's way as the creature lunged towards them. The impact sent Atlas reeling, and he found himself being dragged towards the tear. Frozen in fear, Elise watched helplessly until a surge of determination coursed through her.
Ignoring her trembling hands, she seized her basket of berries, smashing it over the creature's back. In response, the creature turned its attention to her, striking her with a forceful blow. As Atlas regained his footing, he armed himself with a sickle, positioning himself defensively between Elise and the creature.
Their desperate struggle continued, with Atlas narrowly avoiding the creature's attacks. However, a moment of distraction proved costly as the creature struck Atlas, sending him crashing into a tree. Elise's cries pierced the chaos as she witnessed her brother's fall, her heart pounding with terror.
The creature, undeterred, focused its aggression on Elise, inflicting a devastating blow that fractured her leg. With a firm grip on her, it began dragging her towards the rift, leaving Atlas helpless to intervene. In his final moments of consciousness, Atlas glimpsed a glimmer of hope as an arrow pierced the creature's neck, and darkness descended upon him.
Saved by the hunters from their village, Elise and Atlas survived the harrowing encounter, the echoes of the tearing sound fading into the night, leaving behind a haunting reminder of the dangers lurking in the shadows.
chapter ?: As the fight against the mages raged on, Atlas could feel his body weakening. He had already taken down three of the giant ogre-like mages, but he could feel his magical reserves dwindling. Meanwhile, Elise was holding her own against her three opponents, but the mages were proving to be formidable adversaries.
Suddenly, one of the mages broke away from the fight and flew towards the party members, preparing a powerful magic artifact. Atlas knew that he had to act quickly, but he was blocked by the second mage, who used wind magic to throw him back towards Elise. But Elise was quick on her feet and used her own wind magic to redirect Atlas into the blind spot of the third mage.
Atlas took advantage of the situation and decapitated the third mage with his sword. But before he could make his way to the mage with the artifact, it had already activated. Atlas knew that he couldn't stop the attack, so he threw himself between the artifact and his party members.
As the attack hit, Atlas felt his body disintegrating. But he could hear a voice in the back of his head, urging him to get out of there. It was the voice of his sister, who had always been protective of him. But Atlas ignored her plea and instead imagined all of his strength flowing into her.
As he closed his eyes, he saw a dim golden aura surrounding him. The aura mixed with the red and black light from the attack, and Atlas's spirit absorbed the massive spell. He defied the laws of the universe and directed all of the power into one thought: "Elise better make it out of this alive."
In that moment, Atlas sacrificed himself for the sake of his party members and the world. But his spirit lived on, infused with all the love and compassion he had for his sister.
chapter ?: Elise walked slowly towards the pile of dust where her brother once stood. She kneeled down and reached out, picking up a small metal plate with an inscription from Atlas. "I saw death yesterday on my watch, I know I won't make it out of this fight," Elise read out loud, her voice breaking slightly. "I know you will hurt for a long time, but I don't want you to get lost. Become the pillar for the future, find a husband, and please carry on. I love you - Atlas."
Elise closed her eyes and held the metal plate close to her chest. Tears didn't come, but her heart felt shattered. She stood up in a daze and began walking with a blank expression, passing through the many battlefields that had taken the lives of her party members.
As she walked, a small gremlin monster suddenly jumped out and stabbed her in the shoulder, the point of the dagger poking out the front, its tip glistening red. Elise turned and slashed the creature's throat, still with the dagger lodged in her shoulder.
For three days and nights, Elise walked without stopping until she finally arrived at the capital, dagger still in her shoulder. The guards at the gate tried to stop her, but they froze up when they met her eyes and let her pass. Elise then made her way to the palace where the royal guards tried to stop her but also let her pass when they realized who she was.
Elise finally stood before the king and explained what had happened in the fight and how the others had fought honorably. Only after her audience with the king and when she returned to her and Atlas's room did she finally let her emotions out, falling to her knees and crying uncontrollably.
Days passed, and Elise didn't leave her room. She remained isolated, feeling the pain of losing her brother and the others. However, one day, she decided to leave the room and go for a walk around the city. As she walked, she heard someone calling her name.
It's still a work in progress; my ADHD mind jumps from scene to scene without properly finishing any. The story focuses on Elise and Atlas as their world changes. The "main" world is like our medieval age, with knights, kings and farmers, and there is another world with the slight difference that it has magic. The creatures from that world studied magic instead of science and are now having issues with resources and land, so instead of looking up to the stars and going to other planets, they use magic to find another earth and are trying to invade it.
All of that gets explained in 3 or so half-finished chapters, but the main story is going to be Elise and Atlas learning to make use of the limited magic that seeps into their world when the rifts open and trying to stop the creatures from destroying their world, creating adventurer guilds and teaching magic and how to fight these creatures. In a fight to defend the capital, Atlas intercepts a massive spell and, through a way I have yet to properly explain, grants his sister immortality.
submitted by MedicinalPsycho to story [link] [comments]


2024.04.28 23:01 EARTHB-24 Decoding Technical Analysis, ASI

The Accumulative Swing Index (ASI) is a technical analysis indicator used to assess the long-term trend strength of a financial instrument, typically applied to price charts such as those in stocks, commodities, or currencies. Developed by J. Welles Wilder, the ASI incorporates price action, previous highs/lows, and open/close prices to calculate a cumulative value that reflects the overall trend direction.
Key points about the Accumulative Swing Index (ASI):
1. Calculation: The ASI is calculated by first determining the Swing Index (SI) for each period. The Swing Index is calculated based on the current period’s high, low, open, and close prices, as well as the previous period’s close. Then, the Swing Index values are cumulatively added to generate the ASI.
2. Trend Strength: The ASI is used to assess the strength and direction of the prevailing trend. A rising ASI indicates bullish strength, suggesting that buying pressure is outweighing selling pressure, while a falling ASI suggests bearish strength, indicating that selling pressure is dominating.
3. Zero Line: The ASI typically oscillates around a zero line. Values above zero indicate a bullish trend, while values below zero suggest a bearish trend. The magnitude of the ASI value can also provide insights into the strength of the trend, with larger positive values indicating stronger bullish momentum and larger negative values indicating stronger bearish momentum.
4. Divergence: Traders often look for divergences between the ASI and price action as potential signals of trend reversal or continuation. For example, if the ASI is making higher highs while prices are making lower highs, it may indicate weakening bullish momentum and potential for a trend reversal.
5. Uses: The ASI can be used in conjunction with other technical indicators and chart patterns to confirm trend direction, identify potential entry and exit points, and assess the overall health of a trend. It is particularly useful for traders and investors who prefer longer-term analysis and want to filter out short-term noise in the market.
In summary, the Accumulative Swing Index (ASI) is a technical analysis tool used to evaluate the strength and direction of long-term trends in financial markets. By cumulatively adding Swing Index values over time, the ASI provides a smoothed representation of trend strength and helps traders identify potential opportunities in the market.
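To make the cumulative calculation above concrete, here is a rough Python sketch. The exact statement of Wilder's Swing Index varies slightly between sources, so treat the swing_index helper below (and the limit-move value T it takes) as an illustrative assumption rather than a definitive implementation.

def swing_index(o, h, l, c, o_prev, c_prev, T):
    # One common statement of Wilder's Swing Index for a single period.
    # K is the larger of the two gaps from the previous close.
    K = max(abs(h - c_prev), abs(l - c_prev))
    # R depends on which of the three ranges is largest.
    hc, lc, hl = abs(h - c_prev), abs(l - c_prev), abs(h - l)
    if hc >= lc and hc >= hl:
        R = hc - 0.5 * lc + 0.25 * abs(c_prev - o_prev)
    elif lc >= hc and lc >= hl:
        R = lc - 0.5 * hc + 0.25 * abs(c_prev - o_prev)
    else:
        R = hl + 0.25 * abs(c_prev - o_prev)
    if R == 0:
        return 0.0
    num = (c - c_prev) + 0.5 * (c - o) + 0.25 * (c_prev - o_prev)
    return 50.0 * (num / R) * (K / T)

def accumulative_swing_index(opens, highs, lows, closes, T=3.0):
    # The ASI is simply the running (cumulative) sum of the per-period Swing Index values.
    asi, total = [], 0.0
    for i in range(1, len(closes)):
        si = swing_index(opens[i], highs[i], lows[i], closes[i],
                         opens[i - 1], closes[i - 1], T)
        total += si
        asi.append(total)
    return asi

Values of the resulting series above zero would then be read as bullish pressure and values below zero as bearish pressure, exactly as described in point 3 above.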
submitted by EARTHB-24 to growthman [link] [comments]


2024.04.28 22:59 Sufficient_Spirit619 How far into the future is this?

Imagine nanobots that attach themselves to the brain (the visual cortex in this example) and project an augmented reality interface over our vision. Then being able to control this UI using just our thoughts. Then the ability to connect to the internet, make calls and even download skills directly to/through our brain.
submitted by Sufficient_Spirit619 to nanotechnology [link] [comments]


2024.04.28 22:57 The_Fool_Arcana0000 Bayonetta Concepts I: “Corrupted Sin Demons”

Note: All of this is non-canon lore I created to expand upon ideas from the series.
What would have happened if Singularity had damaged Cereza's umbran watch before she used the Deadly Sin Ritual? How would it affect her and her contracted Demon?
A summoner on the brink of death and a near-unstoppable monster would be the outcome...
Corrupted Sin Demons are the direct result of a failed Deadly Sin Ritual, usually caused by an incorrect incantation, insufficient spiritual energy, a disrupted connection, or external forces. If any of these factors were to come to pass, the ritual would not only fail but also disrupt the balance within the Trinity of Realities.
For example, suppose that Cereza desperately attempted to summon Sin Gomorrah with a broken watch. Their connection would be severely strained and on the brink of collapse, which causes Gomorrah to be summoned with severe abnormalities to its abilities, body, and personality. This new form may be considered a "Dark Evolution" and grants Gomorrah a new title: "Genesis of Atrocities".
Moreover, said form is far stronger than Gomorrah's true Sin form, enough to destroy a large country if given time, and to the point that it is uncontrollable without direct use of the Left Eye. Even accessories meant to prevent Infernals from becoming enraged break upon any attempt to control it.
And so, the end result is an unhinged, rampaging monster that can only be stopped if killed or stalled long enough for its magic to be fully depleted...
However, a corrupted Sin Gomorrah, alone, is not the only issue. There is also the matter of the portal used to summon it. Normally, the ritual has the summoner spend a fixed amount of magic (usually in large quantities) to open a portal to Inferno and summon forth the Demon of their choosing.
This is not the case for a failed Deadly Sin Ritual.
When it fails, nearly all of the summoner's magic is drained and unequally distributed between the Sin Demon and the portal used to summon it. The large influx of magic not only forces the portal to maintain its existence at the size used to pull the Infernal through, but also allows any Infernal to wander through.
It will eventually vanish once the magic used to sustain it runs out...
submitted by The_Fool_Arcana0000 to Bayonetta [link] [comments]


2024.04.28 22:47 HumbledIdiot Trying to define formations mathematically and determine individual participant coordinates

Hi askmath, I'm working on a video game prototype as a hobby/passion project that will likely never see the light of day. But the concept of the game has to do with PvP drone swarm combat. The issue I am having has to do with drone formations. My initial concept is:
Select the desired number of drones and command them to a formation. Each drone in the selection is given an index. The index of the drone is used to then determine its location within the formation. In order to provide the player with the ability to create their own formations, I had the idea to determine drone position by equation instead of static values. Here is the pair of functions (in Godot 4 GDScript) that determines them as it stands today (and I am highly suspicious that I have gone down the wrong path with this).
(I selected the Functions flair because I thought it was kind of an x,y,z = f(index) problem, but mods please change it if it's not appropriate, or let me know how to if I can? I haven't ever really used Reddit before, past being a lurker without an account.)
func get_location_for_index(index):
    return Vector3(calculate_equation(index, x_location_equation_type, x_location_equation_scalar, x_location_equation_offset, x_location_equation_wrap_divisor),
            calculate_equation(index, y_location_equation_type, y_location_equation_scalar, y_location_equation_offset, x_location_equation_wrap_divisor),
            calculate_equation(index, z_location_equation_type, z_location_equation_scalar, z_location_equation_offset, x_location_equation_wrap_divisor))

func calculate_equation(index, equation_type, equation_scalar, equation_offset, equation_wrap_divisor):
    var result = 0
    if equation_type == "liniear":
        result = (index * equation_scalar) + equation_offset
    elif equation_type == "quadratic":
        result = ((index * index) * equation_scalar) + equation_offset
    elif equation_type == "cubic":
        result = ((index * index * index) * equation_scalar) + equation_offset
    elif equation_type == "sin":
        result = (sin(index) * equation_scalar) - equation_offset
    elif equation_type == "cos":
        result = (cos(index) * equation_scalar) - equation_offset
    elif equation_type == "tan":
        result = (tan(index) * equation_scalar) - equation_offset
    elif equation_type == "logarithmic":
        result = (log(index) * equation_scalar) - equation_offset
    return result
The Vector3 returned in the first function is just the x,y,z coordinates relative to the position of the formation. I feel I may be doing something dubious here by calculating each axis value individually, and/or that these simple algebraic equations are not the correct type of math to attempt to solve this problem with.
I started with trying to create some basic formations with this structure and either lack the experience in using equations to create specific groups of 3D coordinates or have just completely gone down the wrong path.
The initial formations (all of which I failed to create) were:
10x10x1 square formation
10x10x10 cube formation
Hollow spherical formation
Hollow elipsoid formation
The first two were just for proof of concept; the third and fourth would be for an actual gameplay mechanic of surrounding and capturing enemy resources.
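To make the goal concrete, here is a rough sketch in Python (not my actual GDScript; the spacing value and the Fibonacci-sphere idea are just my assumptions about one common approach, not something I have verified) of deriving positions by decomposing the index instead of using one algebraic equation per axis:

import math

def cube_position(index, width=10, depth=10, spacing=2.0):
    # Decompose the drone index into (column, layer, row) for a filled grid/cube.
    col = index % width                    # x within a row
    row = (index // width) % depth         # z within a layer
    layer = index // (width * depth)       # y, i.e. which layer
    return (col * spacing, layer * spacing, row * spacing)

def sphere_position(index, count, radius=10.0):
    # Rough Fibonacci-sphere spread: 'count' points roughly evenly over a hollow sphere.
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))
    y = 1.0 - 2.0 * index / max(count - 1, 1)      # y runs from 1 down to -1
    r = math.sqrt(max(0.0, 1.0 - y * y))           # ring radius at that height
    theta = golden_angle * index
    return (radius * r * math.cos(theta), radius * y, radius * r * math.sin(theta))

# A 10x10x1 square is the cube mapping with a single layer:
square = [cube_position(i) for i in range(100)]
# A hollow sphere of 100 drones:
sphere = [sphere_position(i, 100) for i in range(100)]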
Any advice, from pointing me in the right direction to examples to build off of, would be greatly appreciated.
Thanks!
submitted by HumbledIdiot to askmath [link] [comments]


2024.04.28 22:46 Chance-Worth-4954 Little story about how it all started..

Little story about how it all started..
A bit of history..
In the 80s and 90s we had satellite dishes with pirate cards, and it worked perfectly; at that time we didn't have any problems with buffering. ;>)) But there were also people like myself with our own subscriptions to both Viasat & Canal Digital that we paid for. At that time it cost EUR 90 per month in today's currency. If you had the knowledge, you could read the original card and then make another one that you could then sell. The problem then was that anyone who had a little knowledge of how card readers worked could in turn make a copy of what he bought. At that time there was no control over how many users a card had. I made my own software so that no one could make a copy of my cards: when one was put in a card reader the card was erased, except for a small "string" where I could see if any attempt to read the card had been made...
The smartcard used by DirecTV after the “H card” became known by pirates as the “HU” card. This was the third smartcard series for DSS (P3 card - period 3). Sometimes referred to as “the football card” because of the artwork on the back of the card.


Oddly enough, the “football card” was not the one OJ Simpson was busted for pirating. That was the previous generation of smartcard, the “H card”.

https://preview.redd.it/smlkhrx76axc1.jpg?width=680&format=pjpg&auto=webp&s=8b037b1d67788d32286def83b97d042b3e38f825
By 2001, “HU loader” pirate smartcard programming software hit the market even before the hacked “H card” swap-out was done. Some pirates still used hacked H cards while others were buying new pirate HU cards that would last through the impending complete shutdown of the H card.
For prices in the tens of thousands of dollars, dealers bought HU loaders - slightly modified WildThing glitchers, with new software for the new smartcard. The pirate customer base had grown a lot from the previous hacked cards, so there was plenty of demand for the HU hack.
Once the HU loader was in the hands of pirates, it wasn’t long before it was cloned. More sources for HU loaders meant competitive prices. Eventually this led to the secret firmware for the HU loader (the “Atmel code”, for the AT90S2313 chip) being posted on a public website.
After the HU loader Atmel code was posted on the “HackHU” website, anybody with the right kind of smartcard programmer could program their own pirate HU cards. Some pirates modified the WildThing unlooper they bought for the H card to be compatible with the new HU loader.
In the early 2000s, detailed technical information about new hacks, schematics and software, was being spread via the Internet at an accelerated rate. It was no longer a trickle of information from high level dealers filtering down to end users.
A pirate might buy an expensive unlooper or loader, think “I could sell a lot of these”, reverse engineer the PCB and draw a schematic, then a locked chip becomes a dead end. So they post schematics online and hope someone else shares the missing pieces.
Many hardware producers (like Mikobu) competed to build and sell ready-to-use HU loaders. By 2002, for under a hundred dollars anybody could get everything they needed to program pirate smartcards. It was never easier to become a DirecTV pirate, or even a pirate card dealer.

https://preview.redd.it/pad3rdwc6axc1.jpg?width=1200&format=pjpg&auto=webp&s=015bb714ef4b0423a65fdf869403e8aca1eaa5df
This is another example, a Mikobu III loader/unlooper. The Mikobu III may have been the top selling pirate card programmer in its time; it was considered the “gold standard” by many pirates.

https://preview.redd.it/qsk44k6h6axc1.jpg?width=680&format=pjpg&auto=webp&s=3a499a9cc93a2a8bd5467351fc16d9a47768b45a
In 2001, the pirates had to deal with something they recognized all too well by now - their hacked smartcards were looped by a countermeasure against them. A lot of people had gotten used to programming their own smartcards, now they had to hope somebody produced an HU unlooper.
As was typical, the first HU unlooper was in the hands of a commercial pirate, unlooping cards for a fee. Later on, though, things took a turn - the growing mass of hackers chatting and sharing information online led to a new kind of development.
Not just one, but two kinds of HU unlooper were released and shared freely on the internet within days of each other in March 2002. HUFF (to unloop the HU cards responding only FF due to looping) and ul4s (unloop for sure). They both worked great.

https://preview.redd.it/axc4rddl6axc1.jpg?width=1200&format=pjpg&auto=webp&s=add5fa4f2cbc84a72e9d9a8642f52ed77f25d346

https://preview.redd.it/lb4d86gm6axc1.jpg?width=1200&format=pjpg&auto=webp&s=ba55d97148c8ce6e91a420c0ccb16ddd4e14ef04
By this time there was a kind of “critical mass” of hackers online, developing and sharing hacks for the HU card. Web forums, usually vBulletin or phpBB, were numerous and very active. Pirates helped each other use hacks and searched for new fixes when their cards were shut down.
Popular forums of the time included The Pirates Den (dsschat .com), Interesting Devices (id-discussions), Innermatrix, DR7 .com, Hitec Sat, and dozens of others. Many forums were owned by pirate dealers, and some acted as administrators or moderators on others.
Forums weren’t only for end-user pirates, some of the hackers developing new fixes frequented them also. Over time there was a shift from hacks developed in private and sold through a dealer network, to more widespread, and sometimes free, direct sharing over the Internet.
Some of the hackers became known and respected by many other pirates who admired the skills required to create new and improved hacks. Names like no1b4me, aol6945, and RAM9999 were amongst the elite in the online forums.

https://preview.redd.it/h0wqaetq6axc1.jpg?width=1200&format=pjpg&auto=webp&s=faa66dca9cf2f39a868fbd68776b9795e7231cba
Some of these hackers’ names would later show up on a DirecTV website publicizing the many legal actions taken against pirates. DirecTV would proudly boast over 24,000 lawsuits against end users, in addition to action against the hackers and dealers.
Satellite TV piracy was big business. In 1998, Canadian pirate Reggie Scullion (V-Cipher) was raided by police and $4 million in cash, bonds, and bank drafts was seized from his home and business, along with over 10,000 DirecTV smartcards. Pirate business was booming.
By 2003, satellite TV piracy seemed to have been growing out of control for years, despite increasing legal actions against pirates. There were estimates in the press of over 3 million satellite TV pirates, resulting in pay TV companies losing $4 billion in revenue.
Most end-user pirates were paying someone to program, or to unloop, their smartcard. The dealers they paid came in all shapes and sizes - friends, family, small businesses, international pirate dealer networks.
Hackers were writing their own versions of 3M code (or ripping off someone else’s), packaging it up in a loader program with a Windows GUI, or in some cases a script for the popular Winexplorer tool, and offering it for sale.
Pirate software evolved alongside changes in computers and the Internet. From MS-DOS text interfaces to simple Windows applications with two or three buttons, to integrated environments for developing and testing hacks.
For the H card, a software package “BasicH” was been popular for being powerful but also relatively simple to use. Later, with the HU card, even more tools like “ExtremeHU” for programming cards and “HU Sandbox” for developing new pirate code became standard tools for hackers.

https://preview.redd.it/07rj9fiw6axc1.jpg?width=766&format=pjpg&auto=webp&s=11baf6c8dcea5fc0b124bde1fb7924a35763b58b
WinExplorer was a widely used tool for smartcard hackers, for writing “.XVB files”, VBScript programs that talked to smartcards or smartcard glitchers. Hackers would share, study, and modify an overwhelming number of .XVB files, a de facto standard for pirate satellite scripts.

https://preview.redd.it/lu7krltz6axc1.jpg?width=760&format=pjpg&auto=webp&s=0943808106bba39bcf9f3f0ee7f82adaa612812f
Another way used by pirates to watch DirecTV without paying was to use an “emu”, or emulator, system. Using a computer connected to a satellite receiver by a simple PCB interface, the smartcard was emulated to allow all channels to be viewed.
Because hackers hadn’t reverse engineered the HU card ASIC, they could only emulate the microcontroller functions while using a real HU card as an “AUX card”, sending data through the hardware ASIC. The emu software protected the original smartcards from ECMs.
After each version of DirecTV (or later, Dish Network) smartcard was hacked, versions of emulator hacks were developed by pirates. SLE44 and Pitou were H card emulators, Kryptonite an HU emu.

https://preview.redd.it/4mcu5vo37axc1.jpg?width=284&format=pjpg&auto=webp&s=f5779dc53fe15b5fc7100e9c66c3b616164c4178
During the era of the pirate HU card, DirecTV may have had a hard time locking out the pirates with glitchers from programming their smartcards, but there was a lot of grief for the pirates in the form of frequent countermeasures shutting down the hacked cards temporarily.
A technique used by the HU card against the pirates was “dynamic code”. Instead of key calculation being done entirely in the ROM and EEPROM firmware known to the hackers, short blocks of program code was sent down in real-time over the satellite.
Dynamic code meant the pirate hackers had to chase a moving target. The code could be changed at any time, and different versions of dynamic code could be rotated in and out of service at any time.
Sending program code over the satellite had been done before, but on the HU card the dynamic code went beyond what pirates had previously dealt with. Let the code run and risk the card being looped. Block unknown code and the card is shut down every few days.
Another electronic countermeasure (ECM) used to target pirate H and HU cards was known as “hashing”. Regions of the smartcard’s memory would be used in the key calculation algorithm, so that the correct decryption keys depended on correct (not hacked) data in the card.
The pirate hackers rose to the challenge, improving their hacks, adding “stealth” and “AI” features to their 3M code. The level of artificial intelligence implemented in these 4 MHz micros with a total 384 bytes of RAM and a KB or two of EEPROM space seems suspect, in retrospect.
Unlike the H card with two chips (microcontroller and ASIC), the HU card only had one chip inside. The HU card had an ASIC on the same silicon die as a Texas Instruments TMS370 microcontroller. 384 bytes of RAM, 16 KB ROM, 8 KB EEPROM.

https://preview.redd.it/m5pby1a87axc1.jpg?width=1200&format=pjpg&auto=webp&s=e4cb7cb084b5a5604e004c6432187753d6bc6f80
Another tactic used by DirecTV to shut down pirate cards was to more aggressively target smartcards that were inactive or had been deactivated in their system. The satellite receiver itself could be disabled, instead of relying on software inside the smartcard to disable itself.
The pirates became familiar with the dreaded “Call Ext. 745” messages when their card ID# was disabled - blacklisted by DirecTV. Pirates cloned cards with different ID# or modded the satellite receiver software, and new hacks were developed to bypass the Ext745 shutdowns.
One type of hack was a “no745 board” that acted as a wedge between the satellite receiver and smartcard. The no745 board exploited a bug in the receivers, allowing it to provide the receiver with a fake ID#, to avoid any blacklisting by DirecTV.
Pirates continued to hack the HU card to watch DirecTV for free, though sometimes having to update their pirate card daily, until the HU smartcard was swapped out to a new “P4” card. The P4 card had increased the level of complexity again to be better protected against hackers.
Support for the HU smartcards was disabled permanently by DirecTV in mid-2004. Since then, there has not been a pirate DirecTV hack on the market, or published online. The new P4 smartcards locked the pirates out.

https://preview.redd.it/g2oiemxe7axc1.jpg?width=1200&format=pjpg&auto=webp&s=c9872b8a955e71e87b6688a55e24e218e7e66a11
Around that time, a lot of pirates had switched their focus to Dish Network (and Bell Expressvu in Canada), which had been hacked since 1999. That’s another topic with many different stories to be told.
In September 2006 at 27 years old, the hacker behind the HU loader, the first pirate hack for the DirecTV HU smartcard, pled guilty to charges carrying up to 5 years in prison. At the time, he was already serving a 30 month prison sentence for hacking DirecTV Latin America.
In the years since, he and the other hackers involved in DirecTV piracy in the 90s/early 2000s have completed their prison sentences, been released, and put satellite TV piracy in the past.
Then we all had different cards and modules....
Here are some examples that many people probably recognize...
https://preview.redd.it/a1wziizx7axc1.jpg?width=680&format=pjpg&auto=webp&s=10309493a65c92d523b18f49c62c673b8c2d9859

https://preview.redd.it/1pq6956n7axc1.jpg?width=1200&format=pjpg&auto=webp&s=b874c1fc4b100a30064dda1d255013ccc5196594

https://preview.redd.it/2y580dep7axc1.jpg?width=1000&format=pjpg&auto=webp&s=677199c3d1a2c705abb79f4217062e5c1956e718

https://preview.redd.it/5sbfv7ar7axc1.jpg?width=1600&format=pjpg&auto=webp&s=1ee313f5b5233e3ca0b417b51f785cfd4c667432

https://preview.redd.it/8fe623vs7axc1.jpg?width=925&format=pjpg&auto=webp&s=9c00f5e3488ec8bb6e4c67bc079fd477ab943bcc

https://i.redd.it/56phv56u7axc1.gif
A little history for those of you who weren't there at the time, when we had to do everything ourselves. Now we have IPTV which is very easy compared to the time before.
Next entry is an IPTV Guide. It's a jungle for most people; even those who call themselves Resellers barely know what they are talking about when they advertise. Most of them do NOT know what it is they have and write what they think, so BEWARE!!
submitted by Chance-Worth-4954 to IPTVAdviceAndTip [link] [comments]


2024.04.28 22:44 Virtual-Grade592 [A4A] [F4A] [M4A] [script offer] First magic lesson with your lich partner [fantasy] [magic] [magical lessons] [lich] [part 6]

This script is part of my lich partner series. You can find the other parts in my masterlist: My masterlist : u/Virtual-Grade592 (reddit.com)
I put the script in scriptbin for ease of recording (I heard some VA's prefer reading it from there): Virtual-Grade592: [A4A] [F4A] [M4A] It turns out your partner is a lich [fantasy] [magic] [undead] - scriptbin
It's okay to fill this script and make minor adjustments. Please give me credit for writing the script and put a link in the comments so that I can find your audio. It's okay to paywall, but send me a copy of the audio then.

(The listener begins their first lesson in learning magic with their partner. For some reason the speaker brought the listener to the middle of a desert)
Aahhh here we are. The perfect place to practice some magic.
[pause]
Why are we in the desert? Because there is nothing out here. There are no people who could spot us doing magic. There is no collateral damage you could cause here. And most importantly, there is peace and quiet. The perfect environment to concentrate.
[pause]
Yeah, this is basically our playground. Here you are free to experiment with magic to your heart’s content. Your magic lessons will take place here.
[pause]
*sarcastic* Aw is the sand too rough for you? Do you find it uncomfortable standing on all this sand? If only there was some mystical power that could help.
[pause]
*teasing* No honey, I won’t do it for you. You’ll have to do it on your own. How else are you going to learn to cast magic?
[pause]
Sweetheart, a demonstration is hardly going to help. I doubt you’d learn anything by just watching me cast magic. You would only see the effect it has, not the process of the magic itself. I’ll show you why it wouldn’t work.
[crack of thunder]
There, a bolt of lightning leapt from my hand. After my amazing demonstration, can you repeat it?
[pause]
No? That’s what I thought. It’s not easy to cast magic honey. You can’t just wave your hand and it happens. It’s like learning to walk or talk. Right now you are stumbling and blabbering, but with time and guidance you can run. So let us begin with a small step at first. The sand is too hot right? It’s not quite a comfortable temperature?
[pause]
Yeah I’ll teach you how to cool it down. First visualise what you want to happen. Imagine the sand getting cool, like when night falls in the desert. The heat seeps away and the grains of sand get nice and cool. Now picture which part of the sand gets cold. Look at your meet. It’s the sand beneath you that you want to cool.
[pause]
Okay, keep that image in your mind. Stay focussed on the cold sand. Now breathe in. *speaker breathes in as well.* Feel the magical energy build up in the back of your mind. Now let the energy flow through your mind, absorbing your image of cold sand. And finally breathe out, letting the magic leave your mind. *speaker breathes out as well.* That’s it, it’s working.
[pause]
*laughing hysterically* Honey, you certainly created something magical. I just didn’t expect you to turn the sand blue. *teasing* Just a bit more practice and you’ll be the greatest painter known to man.
[pause]
*speaker tries to hold their laughter in.* Okay, okay sweetie, I’ll be serious. I won’t laugh, even though you look ridiculous standing on blue sand.
[pause]
*gradually getting serious again.* What went wrong? The frequency of the spell changed. When the magical energy moves through your brain, you change it to a specific frequency, so that it will do what you want it to do. This is why you need to keep imagining what you want to happen. Your brain changes the energy to the frequency of the image you focus on. I think you got distracted or associated cold with blue for a moment. That changed the spell from cooling the sand to changing its colour.
[pause]
*uplifting* Oh sweetheart, don’t be discouraged. Magic is genuinely difficult. You already did great by creating any magical effect. Usually when a magician doesn’t have a clear enough image in their mind, because they get distracted for example, then the spell doesn’t work. The frequency is too messed up to do anything. But you still made it clear enough to have an effect. So you did well for your first time. Just keep trying and you’ll get the hang of it.
[pause]
Yeah I mean it. For a first attempt it was good. And I know you’ll get better if you keep trying. So please give it another go.
[pause]
Good honey, picture what will happen in your mind, breathe in, let the magic float in your skull and breathe out. Let the magic float out.
[pause]
Nothing happened? That’s okay. When I try to learn a new spell, it can take hours or even days before I get it right. It’s frustrating trying over and over again with no results, but that is just part of being a magician. This is simply part of the learning process.
[pause]
You want to try again? Wonderful honey, go ahead.
[pause]
Oh you want me to be quiet? Okay, I won’t disturb your concentration.
[longer pause]
Sweetie, I know it’s frustrating that nothing happened again. Just relax.
[pause]
*reassuring* No, you’re not doing anything wrong. It simply takes practice. But I can see you’re getting frustrated, so I think a break is in order. Let’s get your mind off failure. And I have the perfect way to do it.
[the speaker picks the listener up.]
Here you go, you’re snugly in my arms. Now let me show you how amazing magic can be.
[sound of wind as the speaker begins to fly.]
Yeah, you’re not seeing things, we’re really flying. I know this has always been your favourite superpower. See that tiny blue spot down there? That’s where we stood a few moments ago.
[pause]
It’s amazing isn’t it? Being so high up? Feeling the wind in your hair? All this is possible with a little magic. And you can learn this as well. For now it’s too difficult for you, but in a year or two you’re ready for this.
[pause]
*jokingly* You can’t wait for it? Birds learn to fly by jumping off a tree. Would you learn it if I let you go now?
[pause]
*laughing* I’m kidding. You’re safe with me darling. I’d never let you fall. And when you’re down, I’ll pick you back up.
[pause]
You’re feeling better? You’re ready to get back down and try the spell again? Okay, I’ll fly us down.
[wind noises as the speaker flies down and lands on the ground]
Here we are, back on terra firma. Take a moment to breathe. Relax and feel calm. When you feel ready, then begin casting the spell.
[longer pause]
*surprised* Huh? Honey why did you jump back?
[pause]
It’s gotten too cold? Hang on let me feel the sand.
[speaker pulls hand back in shock]
Oof that’s freezing. Congrats sweetheart, you certainly nailed it. In fact you did a bit too well. You’ve made it far too cold for comfort. But that’s a minor problem. You’ll get more self-control over the temperature the more you practice the spell. We can work on that another time, now I think we should celebrate your first success as a mage.
[pause]
*fake pondering* How to celebrate? I wonder. Perhaps like this?
[snaps fingers and fireworks go off.]
Congratulations my dear. You’re officially a spellcaster. And in my opinion you are the most handsome mage in the world. How do you feel?
[pause]
Tired and hungry? That’s understandable. Your first time with magic is intense. Now that you’ve succeeded your first spell, I think we can call it a day. I’ll teleport us back to town and we can enjoy our dinner in peace. Here, hold my hand.
[teleportation noises]
submitted by Virtual-Grade592 to ASMRScriptHaven [link] [comments]


2024.04.28 22:37 Titty_Slicer_5000 Tensorflow Strided Slice Error. Need help.

TLDR at the bottom
My Full Tensorflow Code: Link. Please excuse all the different commented out parts of code, I've had a long road of trouble shooting this code.
Hardware and Software Setup
-Virtual Machine on Runpod
-NVIDIA A100 GPU
-Tensorflow 2.15
-CUDA 12.2
-cuDNN 8.9
What I'm doing and the issue I'm facing
I am trying to create a visual generator AI, and to that end I am trying to implement the TGANv2 architecture in Tensorflow. The TGANv2 model I am following was originally written in Chainer by some researchers. I also implemented it in Pytorch (here is my PyTorch code if you are interested) and also ran it in Chainer. It works fine in both. But when I try to implement it in Tensorflow I start running into this error:
Traceback (most recent call last):
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/ops/script_ops.py", line 270, in __call__
    ret = func(*args)
          ^^^^^^^^^^^
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/autograph/impl/api.py", line 643, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/data/ops/from_generator_op.py", line 198, in generator_py_func
    values = next(generator_state.get_iterator(iterator_id))
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/3TF-TGANv2.py", line 140, in __iter__
    yield self[idx]
    ~~~~^^^^^
  File "/workspace/3TF-TGANv2.py", line 126, in __getitem__
    x2 = self.sub_sample(x1)
         ^^^^^^^^^^^^^^^^^^^
  File "/workspace/3TF-TGANv2.py", line 99, in sub_sample
    x = tf.strided_slice(x, begin, end, strides)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/eager/execute.py", line 59, in quick_execute
    except TypeError as e:
tensorflow.python.framework.errors_impl.InvalidArgumentError: {{function_node __wrapped__StridedSlice_device_/job:localhost/replica:0/task:0/device:GPU:0}} Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
What's important to note about this issue is that it does not come up right away. It can go through dozens of batches before this issue pops up. This error was generated with a batch size of 16, but if I lower my batch size to 8 I can even get it to run for 5 epochs (longest I've tried). The outputs of the Generator are not what I saw with Chainer or Pytorch after 5 epochs (it's mostly just videos of a giant black blob), though I am unsure if this is related to the issue. So with a batch size of 8 sometimes the issue comes up and sometimes it doesn't. If I lower the batch size to 4, the issue almost never comes up. The fact that this is batch size driven really perplexes me. I've tried it with multiple different GPUs.
Description of relevant parts of model and code
The way the Generator works is as follows. There is a CLSTM layer that generates 16 features maps that have a 4x4 resolution and 1024 channels each. Each feature map corresponds to a frame of the output video (the output video has 16 frames and runs at 8fps, so it's a 2 second long gif).
During inference each feature map passes through 6 upsampling blocks, with each upsampling block doubling the resolution and halving the channels. So after 6 blocks the shape of each frame is (256, 256, 16), so it has a 256p resolution and 16 channels. Each frame then gets rendered by a rendering block to render it into a 3-channel image, of shape (256, 256, 3). So the final shape of the output video is (16, 256, 256, 3) = (T, H, W, C), where T is the number of frames, H is the height, W the width, and C the number of channels. This output is a single tensor.
During training the setup is a bit different. The generated output video will be split up into 4 "sub-videos", each of varying resolution and frames. This will output a tuple of tensors: (tensor1, tensor2, tensor3, tensor4). The shapes of each tensor (after going through a rendering block to reduce the channel length to 3) are tensor1=(16, 32, 32, 3), tensor2=(8, 64, 64, 3), tensor3=(4, 128, 128, 3), tensor4=(2, 256, 256, 3). As you can see, as you go from tensor1 to tensor4 the frame number gets halved each time while the resolution doubles. The real video examples also get split up into 4 sub-video tensors of the same shape. These sub-videos are what are fed into the discriminator. Now the functionality that halves the frame length is called sub-sampling. How the function works is that it starts at either the first or second frame (this is supposed to be random) and then selects every other frame. There is a sub-sample function in both the Videodataset class (which takes the real videos and generates 4 sub-video tensors) and in the Generator class. The Videodataset class outputs 4-D tensors (T, H, W, C), while the Generator class outputs 5-D tensors because it has a batch dimension N.
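Just to illustrate the intended frame-halving on a toy tensor (this snippet is only for illustration, it is not part of my training code), both slicing styles should drop every other frame of a 16-frame clip:

import tensorflow as tf

# Toy 4-D "video": 16 frames of 4x4 RGB noise, shape (T, H, W, C) = (16, 4, 4, 3)
x = tf.random.normal((16, 4, 4, 3))
offset = 0

# NumPy-style slicing: every other frame starting at 'offset'
a = x[offset::2]                                                          # shape (8, 4, 4, 3)

# Equivalent tf.strided_slice call
b = tf.strided_slice(x, begin=[offset, 0, 0, 0], end=[16, 4, 4, 3], strides=[2, 1, 1, 1])

print(a.shape, b.shape)  # both (8, 4, 4, 3)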
This is the sub-sample function in the VideoDataset class:
    def sub_sample(self, x, frame=2):
        original_shape = x.shape  # Logging original shape
        offset = 0
        begin = [offset, 0, 0, 0]  # start from index 'offset' in the frame dimension
        end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3]]
        strides = [frame, 1, 1, 1]  # step 'frame' in the Frame dimension
        x = tf.strided_slice(x, begin, end, strides)
        expected_frames = (original_shape[0]) // frame
        #print(f"VD Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[0]}")
        if x.shape[0] != expected_frames:
            raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[0]}")
        return x
This is the sub-sample function in the Generator class:
    def sub_sample(self, x, frame=2):
        original_shape = x.shape  # Logging original shape
        offset = 0
        begin = [0, offset, 0, 0, 0]  # start from index 'offset' in the second dimension
        end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3], original_shape[4]]
        strides = [1, frame, 1, 1, 1]  # step 'frame' in the second dimension
        x = tf.strided_slice(x, begin, end, strides)
        expected_frames = (original_shape[1]) // frame
        #print(f"Gen Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[1]}")
        if x.shape[1] != expected_frames:
            raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[1]}")
        return x
You'll notice I am using tf.strided_slice(). I originally tried slicing/sub-sampling using the same notation you would do for slicing a numpy array: x = x[:,offset::frame,:,:,:]. I changed it because I thought maybe that was causing some sort of issue.
Below is a block diagram of the Generator and VideoDataset (labeled "Dataset" in the block diagram) functionalities.
https://preview.redd.it/2vh7yx2g09xc1.png?width=1862&format=png&auto=webp&s=143d5c4c8df91fc71b9da1d3858feaae28c4605a
A point of note about the block diagram, the outputs of Dataset are NOT combined with the outputs of the Generator, as might be mistakenly deduced based on the drawing. The discriminator outputs predictions on the Generator outputs and the Dataset outputs separately.
I don't think this issue is happening in the backward pass because I put in a bunch of print statements and based on those print statements the error does not occur in the middle of a gradient calculation or backward pass.
My Dataloader and VideoDataset class
Below is how I am actually fetching data from my VideoDataset class:
    #Create dataloader
    dataset = VideoDataset(directory)
    dataloader = tf.data.Dataset.from_generator(
        lambda: iter(dataset),  # Corrected to use iter() to clearly return an iterator from the dataset
        output_signature=(
            tf.TensorSpec(shape=(16, 32, 32, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(8, 64, 64, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(4, 128, 128, 3), dtype=tf.float32),
            tf.TensorSpec(shape=(2, 256, 256, 3), dtype=tf.float32)
        )
    ).batch(batch_size)
and here is my VideoDataset class:
class VideoDataset():
    def __init__(self, directory, fraction=0.2, sub_sample_rate=2):
        print("Initializing VD")
        self.directory = directory
        self.fraction = fraction
        self.sub_sample_rate = sub_sample_rate
        all_files = [os.path.join(self.directory, file) for file in os.listdir(self.directory)]
        valid_files = []
        for file in all_files:
            try:
                # Read the serialized tensor from file
                serialized_tensor = tf.io.read_file(file)
                # Deserialize the tensor
                tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32)  # Adjust dtype if necessary
                # Validate the shape of the tensor
                if tensor.shape == (16, 256, 256, 3):
                    valid_files.append(file)
            except Exception as e:
                print(f"Error loading file {file}: {e}")
        # Randomly select a fraction of the valid files
        selected_file_count = int(len(valid_files) * fraction)
        print(f"Selected {selected_file_count} files")
        self.files = random.sample(valid_files, selected_file_count)

    def sub_sample(self, x, frame=2):
        original_shape = x.shape  # Logging original shape
        offset = 0
        begin = [offset, 0, 0, 0]  # start from index 'offset' in the frame dimension
        end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3]]
        strides = [frame, 1, 1, 1]  # step 'frame' in the Frame dimension
        x = tf.strided_slice(x, begin, end, strides)
        expected_frames = (original_shape[0]) // frame
        #print(f"VD Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[0]}")
        if x.shape[0] != expected_frames:
            raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[0]}")
        return x

    def pooling(self, x, ksize):
        if ksize == 1:
            return x
        T, H, W, C = x.shape
        Hd = H // ksize
        Wd = W // ksize
        # Reshape the tensor to merge the spatial dimensions into the pooling blocks
        x_reshaped = tf.reshape(x, (T, Hd, ksize, Wd, ksize, C))
        # Take the mean across the dimensions 3 and 5, which are the spatial dimensions within each block
        pooled_x = tf.reduce_mean(x_reshaped, axis=[2, 4])
        return pooled_x

    def __len__(self):
        return len(self.files)

    def __getitem__(self, idx):
        #print("Calling VD getitem method")
        serialized_tensor = tf.io.read_file(self.files[idx])
        video_tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32)
        x1 = video_tensor
        x2 = self.sub_sample(x1)
        x3 = self.sub_sample(x2)
        x4 = self.sub_sample(x3)
        #print("\n")
        x1 = self.pooling(x1, 8)
        x2 = self.pooling(x2, 4)
        x3 = self.pooling(x3, 2)
        #print(f"Shapes of VD output = {x1.shape}, {x2.shape}, {x3.shape}, {x4.shape}")
        return (x1, x2, x3, x4)

    def __iter__(self):
        print(f"Calling VD iter method, len self = {len(self)}")
        #Make the dataset iterable, allowing it to be used directly with tf.data.Dataset.from_generator.
        for idx in range(len(self)):
            yield self[idx]
In my opinion the issue is happening at some point when the dataloader is fetching examples from VideoDataset; I just can't figure out what is causing it.
TLDR
I am using a runpod VM with an NVIDIA A100 GPU. I am trying to train a GAN that outputs 2 second long gifs that are made up of 16 frames. One of the training steps involves splitting the output video (either real or fake) into 4 sub-videos of different frame length and resolution. The reduction of frames is achieved by a sub-sample function (which you can find earlier in my post, it is bolded) that starts at the first or second frame of the video (random) and then selects every other frame, so it halves the frames. So I am essentially doing a strided slice on a tensor, and I am using tf.strided_slice(). I tried using regular slicing notation (like you would use in NumPy), and I get the same error. The weird thing about this is that the issue does NOT come up immediately in training and is dependent on batch size. The training goes through several batch iterations just fine (and sometimes some epochs) with a batch size of 16. If I lower the batch size to 8 it's able to go through even more iterations, even up to 5 epochs (I didn't test it for longer), although the outputs are not the outputs I would expect after some epochs (I expect a specific type of noisy image based on how this model ran in the PyTorch and Chainer frameworks, but I instead get a video that's mostly just a black blob through most of the resolution, just a bit of color on the edges). If I go down to a batch size of 4 the issue goes away mostly. See below for the error I am seeing:
Error:
Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
submitted by Titty_Slicer_5000 to MLQuestions [link] [comments]


2024.04.28 22:37 Hajeia NEED ADVICE: Gem Stalker Redesign for 2nd Level Party

Hi everyone,
I am unsure how and if I can add images - so sry if there are none to better understand what I'm talking about. I am happy to share the images of the map and stat block in some way if I'm told how :D

TL;DR

I find it really hard to create balanced combat encounters, especially those that are challenging but doable and in the end rewarding (like with boss fights). Therefore I need help adjusting the difficulty of my final boss and the fight for a rather new group of Level 2 players. I used a self-made homebrew version of a GEM STALKER with some adjusted Stats and Lair Actions (see below). Do you think that is balanced, also given the terrain? Or would you change something to make it fairer?

PARTY

I am currently planning out the final battle for a Homebrew One Shot Adventure for 1st time players and need help with balancing the final boss for my party (haven't done a lot of homebrew monsters so far). It will be their second session but they're doing very well, in terms of understanding the rules, using their abilities and with roleplaying (even in combat).
We're using some simplified rules, the amazing DnD Story Mode made by u/joelesko , which I adjusted myself in terms of adding some more races, an additional feature specific to a kind of subclass and making the druid and ranger class a bit more unique, nothing too major though, that should impact the balance of the rule set too much (at least I hope so xD so far it felt quite balanced). Here's the Class Reference Sheets and Character Creation Guide I made for my players if anyone wants to have a look and have a reference how the characters were build. (Unfortunately my guides are all in German as my players don't speak English very well but maybe you can still gleam the most important information from it as many features are kept in English (for simplicity of referencing) - sry about that)
The party consists of 5 players, each Level 2:
I was thinking of maybe leveling them up to Level 3 before the boss fight to make them a bit stronger and more resistant, but am unsure if that's the way to go as they already leveled up once during the last session (after their first fight and completing the first part of the adventure) and I don't want to overwhelm them with options. Or should I just give them like a boost in abilities, I'd have a way to make that plausible in game (same for the level up)

MONSTER: GEM STALKER

Generally they will be fighting a cursed amethyst dragon (called Belayana) with a stat block based on that of an Amethyst Gem Stalker, that is trying to protect a magic tree that gives the surrounding forest and the beings within life energy. They can either kill it (and let the forest die but get the "Quest" money), knock it unconscious and help Belayana to get back to her true Dragon form, by uniting her body with that of the gem tree, or (with a lot of clever thinking and luck) might be able to persuade it to stand down and find a different way.
Belayana will be rather aggressive when fighting as she sees everyone in the cave as a danger to the tree and the forest, and is intent on protecting it no matter what. She will use walls and water and her teleport to her advantage to attack and move around. Although she is turned into a monster, she is still intelligent and acts like it, but will fight to the death.
I adjusted the Stat Block of the Gem Stalker with the 5etools integrated CR adjustments in terms of damage and abilities to a CR 2. Below is everything I changed or added (indicated by a +)
I was thinking of either modifying a mephit (which one though?) to a gem flavour to use as a minion AND/OR use the Small Earth Elemental (adjusted to only one attack per turn) created by u/Kankerata. I usually like using minions, because, imo, then the fight gains a better action economy instead of just circling and pummeling the big bad.
MAP
For the sake of completeness, and to better calculate what difficulty is appropriate:
I used this map by u/FantastiskDoD as a general base and adjusted it to my amethyst setting for Foundry VTT. The map is ~190ft in length and 160ft top to bottom (5ft grid). There are two bodies of water (left and right), and some of the crystals on the ground can be used to hide behind. The big gemstones inside the right lake are a stand-in for the big magical gem tree that fuels the magic of the forest and of Belayana. There are stone slips inside the left lake that can be traversed by jumping and, if necessary, an Athletics check, but they are considered difficult terrain as they are wet and slippery. A hidden entrance through a waterfall is the primary way into the cave. The ground in between the lakes is about 30ft wide at its smallest part.

Thanks in advance for any help and ideas :D I am happy to answer any questions that might arise or post this somewhere else, where it is more fitting - I am aware it's a lot of information and not the best presented. I just don't know what would be important in that regard >.<
submitted by Hajeia to DMAcademy [link] [comments]


2024.04.28 22:37 Titty_Slicer_5000 Tensorflow Strided Slice Error. Need help.

Tensorflow Strided Slice Error. Need help.
TLDR at the bottom
My Full Tensorflow Code: Link. Please excuse all the different commented out parts of code, I've had a long road of trouble shooting this code.
Hardware and Software Setup
-Virtual Machine on Runpod
-NVIDIA A100 GPU
-Tensorflow 2.15
-CUDA 12.2
-cuDNN 8.9
What I'm doing and the issue I'm facing
I am trying to create a visual generator AI, and to that end I am trying to implement the TGANv2 architecture in Tensorflow. The TGANv2 model I am following was originally written in Chainer by some researchers. I also implemented it in Pytorch (here is my PyTorch code if you are interested) and also ran it in Chainer. It works fine in both. But when I try to implement it in Tensorflow I start running into this error:
Traceback (most recent call last):
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/ops/script_ops.py", line 270, in __call__
    ret = func(*args)
          ^^^^^^^^^^^
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/autograph/impl/api.py", line 643, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/data/ops/from_generator_op.py", line 198, in generator_py_func
    values = next(generator_state.get_iterator(iterator_id))
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/3TF-TGANv2.py", line 140, in __iter__
    yield self[idx]
    ~~~~^^^^^
  File "/workspace/3TF-TGANv2.py", line 126, in __getitem__
    x2 = self.sub_sample(x1)
         ^^^^^^^^^^^^^^^^^^^
  File "/workspace/3TF-TGANv2.py", line 99, in sub_sample
    x = tf.strided_slice(x, begin, end, strides)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/root/anaconda3/envs/tf_gpu/lib/python3.11/site-packages/tensorflow/python/eager/execute.py", line 59, in quick_execute
    except TypeError as e:
tensorflow.python.framework.errors_impl.InvalidArgumentError: {{function_node __wrapped__StridedSlice_device_/job:localhost/replica:0/task:0/device:GPU:0}} Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
What's important to note about this issue is that it does not come up right away. It can go through dozens of batches before this issue pops up. This error was generated with a batch size of 16, but if I lower my batch size to 8 I can even get it to run for 5 epochs (longest I've tried). The outputs of the Generator are not what I saw with Chainer or Pytorch after 5 epochs (it's mostly just videos of a giant black blob), though I am unsure if this is related to the issue. So with a batch size of 8 sometimes the issue comes up and sometimes it doesn't. If I lower the batch size to 4, the issue almost never comes up. The fact that this is batch size driven really perplexes me. I've tried it with multiple different GPUs.
Description of relevant parts of model and code
The way the Generator works is as follows. There is a CLSTM layer that generates 16 features maps that have a 4x4 resolution and 1024 channels each. Each feature map corresponds to a frame of the output video (the output video has 16 frames and runs at 8fps, so it's a 2 second long gif).
During inference each feature map passes through 6 upsampling blocks, with each upsampling block doubling the resolution and halving the channels. So after 6 blocks the shape of each frame is (256, 256, 16), meaning it has a 256p resolution and 16 channels. Each frame then gets rendered by a rendering block into a 3-channel image of shape (256, 256, 3). So the final shape of the output video is (16, 256, 256, 3) = (T, H, W, C), where T is the number of frames, H is the height, W the width, and C the number of channels. This output is a single tensor.
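Just to illustrate that arithmetic, here is a tiny standalone sketch (not code from my model, purely a sanity check of the shape progression through the 6 blocks):

# Illustrative only: how the per-frame shape evolves through the 6 upsampling blocks.
h = w = 4      # CLSTM feature map resolution
c = 1024       # CLSTM feature map channels
for block in range(1, 7):
    h, w, c = h * 2, w * 2, c // 2   # each block doubles resolution and halves channels
    print(f"after block {block}: ({h}, {w}, {c})")
# after block 6: (256, 256, 16); the rendering block then maps 16 channels down to 3 (RGB)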
During training the setup is a bit different. The generated output video will be split up into 4 "sub-videos", each of varying resolution and frame count. This will output a tuple of tensors: (tensor1, tensor2, tensor3, tensor4). The shapes of each tensor (after going through a rendering block to reduce the channel length to 3) are tensor1=(16, 32, 32, 3), tensor2=(8, 64, 64, 3), tensor3=(4, 128, 128, 3), tensor4=(2, 256, 256, 3). As you can see, as you go from tensor1 to tensor4 the frame number gets halved each time while the resolution doubles. The real video examples also get split up into 4 sub-video tensors of the same shapes. These sub-videos are what get fed into the discriminator. Now the functionality that halves the frame length is called sub-sampling. How the function works is that it starts at either the first or second frame (this is supposed to be random) and then selects every other frame. There is a sub-sample function in both the VideoDataset class (which takes the real videos and generates 4 sub-video tensors) and in the Generator class. The VideoDataset class outputs 4-D tensors (T, H, W, C), while the Generator class outputs 5-D tensors because it has a batch dimension N.
This is the sub-sample function in the VideoDataset class:
def sub_sample(self, x, frame=2):
    original_shape = x.shape  # Logging original shape
    offset = 0
    begin = [offset, 0, 0, 0]  # start from index 'offset' in the frame dimension
    end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3]]
    strides = [frame, 1, 1, 1]  # step 'frame' in the Frame dimension
    x = tf.strided_slice(x, begin, end, strides)
    expected_frames = (original_shape[0]) // frame
    #print(f"VD Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[0]}")
    if x.shape[0] != expected_frames:
        raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[0]}")
    return x
This is the sub-sample function in the Generator class:
def sub_sample(self, x, frame=2):
    original_shape = x.shape  # Logging original shape
    offset = 0
    begin = [0, offset, 0, 0, 0]  # start from index 'offset' in the second dimension
    end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3], original_shape[4]]
    strides = [1, frame, 1, 1, 1]  # step 'frame' in the second dimension
    x = tf.strided_slice(x, begin, end, strides)
    expected_frames = (original_shape[1]) // frame
    #print(f"Gen Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[1]}")
    if x.shape[1] != expected_frames:
        raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[1]}")
    return x
You'll notice I am using tf.strided_slice(). I originally tried slicing/sub-sampling using the same notation you would use for slicing a NumPy array: x = x[:,offset::frame,:,:,:]. I changed it because I thought maybe that was causing some sort of issue.
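For reference, on a tensor whose rank is fully known the two notations should behave the same; this is just a toy check I would run in isolation, not code from my script:

import tensorflow as tf

# Toy check (not from my actual script): take every other frame of a
# rank-4 (T, H, W, C) tensor with both notations and compare the results.
x = tf.random.uniform((16, 4, 4, 3))
a = x[0::2]                                               # numpy-style slicing on the frame dimension
b = tf.strided_slice(x, [0, 0, 0, 0], [16, 4, 4, 3], [2, 1, 1, 1])
print(a.shape, b.shape)                                   # (8, 4, 4, 3) (8, 4, 4, 3)
print(bool(tf.reduce_all(tf.equal(a, b))))                # True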
Below is a block diagram of the Generator and VideoDataset (labeled "Dataset" in the block diagram) functionalities.
https://preview.redd.it/2vh7yx2g09xc1.png?width=1862&format=png&auto=webp&s=143d5c4c8df91fc71b9da1d3858feaae28c4605a
A point of note about the block diagram: the outputs of Dataset are NOT combined with the outputs of the Generator, as might be mistakenly deduced from the drawing. The discriminator outputs predictions on the Generator outputs and the Dataset outputs separately.
I don't think this issue is happening in the backward pass because I put in a bunch of print statements and based on those print statements the error does not occur in the middle of a gradient calculation or backward pass.
My Dataloader and VideoDataset class
Below is how I am actually fetching data from my VideoDataset class:
#Create dataloader
dataset = VideoDataset(directory)
dataloader = tf.data.Dataset.from_generator(
    lambda: iter(dataset),  # Corrected to use iter() to clearly return an iterator from the dataset
    output_signature=(
        tf.TensorSpec(shape=(16, 32, 32, 3), dtype=tf.float32),
        tf.TensorSpec(shape=(8, 64, 64, 3), dtype=tf.float32),
        tf.TensorSpec(shape=(4, 128, 128, 3), dtype=tf.float32),
        tf.TensorSpec(shape=(2, 256, 256, 3), dtype=tf.float32)
    )
).batch(batch_size)
and here is my VideoDataset class:
class VideoDataset():
    def __init__(self, directory, fraction=0.2, sub_sample_rate=2):
        print("Initializing VD")
        self.directory = directory
        self.fraction = fraction
        self.sub_sample_rate = sub_sample_rate
        all_files = [os.path.join(self.directory, file) for file in os.listdir(self.directory)]
        valid_files = []
        for file in all_files:
            try:
                # Read the serialized tensor from file
                serialized_tensor = tf.io.read_file(file)
                # Deserialize the tensor
                tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32)  # Adjust dtype if necessary
                # Validate the shape of the tensor
                if tensor.shape == (16, 256, 256, 3):
                    valid_files.append(file)
            except Exception as e:
                print(f"Error loading file {file}: {e}")
        # Randomly select a fraction of the valid files
        selected_file_count = int(len(valid_files) * fraction)
        print(f"Selected {selected_file_count} files")
        self.files = random.sample(valid_files, selected_file_count)

    def sub_sample(self, x, frame=2):
        original_shape = x.shape  # Logging original shape
        offset = 0
        begin = [offset, 0, 0, 0]  # start from index 'offset' in the frame dimension
        end = [original_shape[0], original_shape[1], original_shape[2], original_shape[3]]
        strides = [frame, 1, 1, 1]  # step 'frame' in the Frame dimension
        x = tf.strided_slice(x, begin, end, strides)
        expected_frames = (original_shape[0]) // frame
        #print(f"VD Expected frames after sub-sampling: {expected_frames}, Actual frames: {x.shape[0]}")
        if x.shape[0] != expected_frames:
            raise ValueError(f"Expected frames: {expected_frames}, but got {x.shape[0]}")
        return x

    def pooling(self, x, ksize):
        if ksize == 1:
            return x
        T, H, W, C = x.shape
        Hd = H // ksize
        Wd = W // ksize
        # Reshape the tensor to merge the spatial dimensions into the pooling blocks
        x_reshaped = tf.reshape(x, (T, Hd, ksize, Wd, ksize, C))
        # Take the mean across the dimensions 3 and 5, which are the spatial dimensions within each block
        pooled_x = tf.reduce_mean(x_reshaped, axis=[2, 4])
        return pooled_x

    def __len__(self):
        return len(self.files)

    def __getitem__(self, idx):
        #print("Calling VD getitem method")
        serialized_tensor = tf.io.read_file(self.files[idx])
        video_tensor = tf.io.parse_tensor(serialized_tensor, out_type=tf.float32)
        x1 = video_tensor
        x2 = self.sub_sample(x1)
        x3 = self.sub_sample(x2)
        x4 = self.sub_sample(x3)
        #print("\n")
        x1 = self.pooling(x1, 8)
        x2 = self.pooling(x2, 4)
        x3 = self.pooling(x3, 2)
        #print(f"Shapes of VD output = {x1.shape}, {x2.shape}, {x3.shape}, {x4.shape}")
        return (x1, x2, x3, x4)

    def __iter__(self):
        print(f"Calling VD iter method, len self = {len(self)}")
        # Make the dataset iterable, allowing it to be used directly with tf.data.Dataset.from_generator.
        for idx in range(len(self)):
            yield self[idx]
In my opinion, the issue is happening at the point when the dataloader is fetching examples from VideoDataset; I just can't figure out what is causing it.
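One thing I am considering as a sanity check (this is hypothetical, not in my current code) is asserting the rank of the tensor right before the slice, so that if a wrongly-shaped element ever comes out of the pipeline it fails with a clearer message than the StridedSlice error:

# Hypothetical sanity check, not in my current code: fail loudly if the tensor
# arriving at sub_sample is not the rank that begin/end/strides assume.
def checked_sub_sample(x, frame=2):
    tf.debugging.assert_rank(x, 4, message="sub_sample expects a (T, H, W, C) tensor")
    return tf.strided_slice(
        x,
        [0, 0, 0, 0],
        tf.shape(x),       # dynamic shape, so a partially unknown static shape still works
        [frame, 1, 1, 1],
    )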
TLDR
I am using a runpod VM with an NVIDIA A100 GPU. I am trying to train a GAN that outputs 2 second long gifs that are made up of 16 frames. One of the training steps involves splitting the output video (either real or fake) into 4 sub-videos of different frame length and resolution. The reduction of frames is achieved by a sub-sample function (which you can find earlier in my post, it is bolded) that starts at the first or second frame of the video (random) and then selects every other frame, so it halves the frames. So I am essentially doing a strided slice on a tensor, and I am using tf.strided_slice(). I tried using regular slicing notation (like you would use in NumPy), and I get the same error. The weird thing about this is that the issue does NOT come up immediately in training and is dependent on batch size. The training goes through several batch iterations just fine (and sometimes some epochs) with a batch size of 16. If I lower the batch size to 8 it's able to go through even more iterations, even up to 5 epochs (I didn't test it for longer), although the outputs are not the outputs I would expect after some epochs (I expect a specific type of noisy image based on how this model ran in the PyTorch and Chainer frameworks, but I instead get a video that's mostly just a black blob through most of the resolution, with just a bit of color on the edges). If I go down to a batch size of 4 the issue mostly goes away. See below for the error I am seeing:
Error:
Expected begin and size arguments to be 1-D tensors of size 2, but got shapes [4] and [2] instead. [Op:StridedSlice]
submitted by Titty_Slicer_5000 to learnmachinelearning [link] [comments]


2024.04.28 22:37 Titty_Slicer_5000 [Project] Tensorflow Strided Slice Error. Need help.

[Project] Tensorflow Strided Slice Error. Need help.
submitted by Titty_Slicer_5000 to MachineLearning [link] [comments]


2024.04.28 22:24 ProfessorZik-Chil theories chart

theories chart submitted by ProfessorZik-Chil to generatorrex [link] [comments]


2024.04.28 22:23 dynam1keNL mikefive, a Kailh PG1316 keyboard

mikefive, a Kailh PG1316 keyboard
I present to you my second keyboard project, and my first fully custom project: the mikefive. If you like, read below how it came to be and more details about the build.
https://preview.redd.it/9ze3tggr1axc1.jpg?width=4096&format=pjpg&auto=webp&s=d58455d5912634e81660870740edf206309f01cd
My first keyboard project was a typeractive wireless Corne, which I built about a year ago. After typing 6-finger QWERTY for my whole 38-year life, I switched to ortho, split, Colemak-DH, blank keycaps, learned to touch type, and never looked back. However, I found that I was always orienting the Corne halves the same way and started thinking about a unibody.
A friend from work liked my 'alternative' keyboard and wanted to build something too. I showed him the rabbit hole, including switch options, and also showed him the Kailh X (PG1425) switches. These, and especially the keycaps, were hard to come by, but we liked the idea of a slim keyboard, so we decided to email Kailh directly. To our surprise, we could order X switches and caps directly from them, although there was a somewhat high MOQ (minimum order quantity). So he, I, and my friend's housemate decided to order together.
But Kailh suddenly said: “Are you also interested in these PG1316 switches?”. I had never heard of those, but the spec sheet they sent looked interesting: tactile, even lower than the X switches, and completely surface-mounted on the PCB. Officially, these are laptop switches. But hey, potentially this could become something really slim. So we decided to order a sample batch of these too.
My friend continued his design for the X switches and Chocs (PG1350), but when the Kailh box arrived and I saw and felt the PG1316s, I knew I wanted to build a keeb with those. I learned to make PCBs with help from Joe Scotto’s YouTube video and KiCad library, and from the same friend, who happens to be a mechatronics engineer. I am an industrial product design engineer, so I know my way around 3D CAD and product design.
And, here we are. The mikefive, which gets its name from, well.. me, and its complete thickness of 5mm. Including the keycap, the switch stands 4.2mm tall and is mounted on a 0.8mm thick PCB, making a total of 5mm. The switch has a travel of 1.8mm, and magically disappears completely inside the keycap volume when pressed. In the picture below you can see how thin it is, compared to my Corne with Chocs.
https://preview.redd.it/66j1j9eu1axc1.jpg?width=4096&format=pjpg&auto=webp&s=0557230fe08dd332c5db38998175a5e4eaf11e5c
Because the switch is surface mounted there are no solder pins sticking through the PCB and the PCB can be safely used as a bottom plate without exposing any contacts.
Kailh was nice enough to send the 3D CAD files of the switch and cap, so I could use them for checking the fit in KiCad as well as make some nice renders to make design choices a bit easier. Here is a render of the final design before I ordered. Note how I made the bottom edge of the housing near the thumb clusters a little lower than the other edges, so the user's thumbs will not interfere with the edge there.
https://preview.redd.it/2gvasd483axc1.jpg?width=2000&format=pjpg&auto=webp&s=d930be16c29b186335aec58fa102e9704c0bd364
I chose a 17x17mm spacing, sometimes referred to as CFX spacing. This is 1mm narrower than the 18x17mm Choc spacing I was used to. The choice was primarily based on the square PG1316 keycaps, because I dislike unequal spacing between keycaps. I 3D printed a mockup and the CFX spacing felt very workable, so I went with it. The PCBs and the CNC'ed aluminum housing are both from JLC. I did some splatter artwork on the back of the PCB, including an isolated solder pad in the shape of the logo.
https://preview.redd.it/0qy7iolv1axc1.jpg?width=4096&format=pjpg&auto=webp&s=da29fdd8a83da84a3c320a6f437128f7e731c69c
Soldering was all done using a Miniware hotplate and solder paste we have at work. It is impossible to solder the PG1316 switches with an iron, because the contacts are located underneath the switch. Four larger contacts on the corners of the switch lock the switch's 'frame' to the PCB with solder. I placed vias in these corner pads for a more secure connection to the PCB. Because the hotplate is small, it took some time to solder everything, but it was easy and I enjoyed getting closer to testing it.
https://preview.redd.it/y7501i0x1axc1.jpg?width=4096&format=pjpg&auto=webp&s=8ad295892ab851c8954841c170b138c44c8c6d34
Despite these being the thinnest switches I have seen, there is space underneath the switch for a backlight LED, which I did not place. Instead, I used this space for the 1N4148W diodes in a SOD-123 package. Soldering with a hotplate is easy and magical, as the tiny components align by themselves. There is also a popular MSK12C02 power switch to disconnect the battery. The diodes, switch and controller were ordered from splitkb, which is in my tiny country. Thanks for the stroopwafels, Thomas 😉
https://preview.redd.it/entjmv1z1axc1.jpg?width=4096&format=pjpg&auto=webp&s=19049b31d1d8361db849ba786e10449256571b3a
Next to the extremely low-profile switches, I also needed to fit a controller and battery. Luckily, my typeractive Corne already showed me the right parts: the super thin nicenanov2 and the 301230 battery, which both max out below 3mm. I had never seen a through-hole controller mounted flush like this, but with the hotplate the soldering was a breeze. I made some small additional pads next to the controller pads (you can see them in the picture above) so I could check with a multimeter that all the individual pads were connected well.
To my surprise, with my first time designing a PCB, first time hotplate soldering, and first time making a custom shield in ZMK, everything worked! It was an open question whether there would still be a good Bluetooth connection with the metal housing covering the whole center controller, but everything just works perfectly. During PCB design, I removed the ground planes on the PCB locally where the Bluetooth antenna of the nicenano is, and the controller being so close to the bottom probably helps get the radio waves out through the bottom.
https://preview.redd.it/rkd3bdy02axc1.jpg?width=4096&format=pjpg&auto=webp&s=fce3fbc1d6c9e468e998e1d5dbd27c77e85263a4
I wanted the case to add as little as possible to the keyboard. I primarily wanted the case to stiffen up the relatively thin PCB and protect the surface-mounted switches from side impact, for example when dropping it into my bag. That is also why the 'holes' are in the keeb: to make the contour smooth for easy into-backpack sliding. Each half is at 15 degrees, so 30 degrees total between halves. I experimented with this angle using my Corne and liked it this way. The center piece is as small as it can be while housing the controller and battery.
https://preview.redd.it/5ywdcgm22axc1.jpg?width=4096&format=pjpg&auto=webp&s=6d84d58cbac9ad3dc291c85a2527aa9460e8e754
The Kailh-provided keycaps are transparent and have the letters A, B, C and D on them on the inside. Probably mold markings from production. I guess these would be painted when used in laptops, and transparent so the light passes through. I decided on the white PCB color and natural aluminum housing to match the current switch appearance a bit.
https://preview.redd.it/loyify842axc1.jpg?width=4096&format=pjpg&auto=webp&s=7a31ce7af3c9a17f5dd104ed45f1f9c7135ae748
There is one slight flaw: the PCB warped slightly on one side during all of the hotplate soldering. Therefore you can see it lifts slightly out of the housing at the bottom edge. Unfortunately, I did not put a screw there to hold it in place, like I did at each corner and in the middle using countersunk M2x3 Torx screws. Yes, I did some manual countersinking with a countersinking drill bit in a 0.8mm PCB to make the bottom fully flat. I made sure to have no copper ground planes around the PCB holes to make countersinking easy, and it was.
So, how does it type? Well, the first thing I noticed, coming from Choc Reds (linear, 50g), is that PG1316s are very tactile and very strong. I also have all the tactile Chocs sampled here, but nothing comes close to the tactile bump in these. The spec sheet says 60g tactile force and 32g operating force, but I actually question those values. I am getting more used to it as I work with it more, but I think it is still a bit heavy for my taste. I emailed Kailh about my experience, so I am curious what they will say.
But then, the height. It is so comfortable, it's incredible. Even with the low-profile Chocs I had some strain during longer sessions. But this is incredible. No strain at all. It is like tapping the table surface.
And then there is the portability. This thing is slimmer than your phone or tablet. It slides into your backpack's tablet compartment with ease. It is also very light. The case is aluminum, but it is all very thin, so it weighs nothing.
I am excited about it, and will keep you updated on revisions and such. I can share gerbers and stuff if people want it. Let me know in the comments or send me a message.
submitted by dynam1keNL to ErgoMechKeyboards [link] [comments]

