I'm running down the corridor of a locker-lined school. As I burst up to the class I'm supposed to be in, I'm told it's full, but there's another down the hall I can go to.
This is like a nightmare. It's not my fault, I tell the person on the door, sweating, but they don't seem to really care. Why would they? They're just there to make sure people don't go into the wrong rooms.
But it's not my fault. I said classroom 115 to the guide, but he heard classroom 150 so there I went, missing the start of my class.
Because that's what I'm doing. Attending class, and my first efforts in doing so for nearly two decades have fallen flat. And that's not even the worst of it.
Within 20 minutes I've been told to take a seat at the back of the class, lost my place in the task I was supposed to be following because I was sending an email, been told off for something I didn't actually do, and had to be hand-held through a task that 12-year-old children are supposed to be able to do.
By the time I need to message someone, explaining where to meet for another briefing, I'm already automatically holding my phone under the desk when typing, such is the level of childhood regression I'm experiencing.
I wish I were embellishing any of this, but sadly this is what happened when I went back to school today to see the new iPad (2018).
Apple thought it would be a lark to have the briefings on the new iPad in the style of classes, complete with itineraries in the style of a school schedule, but all it served to do was make me stressed out.
Even the school smell was the same - wandering around these halls would make anyone feel a natural level of teen angst.
But once I embraced the learning experience, trying to get a sense of whether the new iPad would actually be any good for teaching students, I realised that some facts had wormed their way into my mind thanks to a style of teaching I'm not used to.
- Read our hands-on review: new iPad (2018)
The first app I tried out was Froggipedia - an AR-enabled title that allows you to study the lifecycle of a frog… and then chop it up when it's dead.
It might sound gross, but amphibian dissection is a stereotypical part of high school life in the US - and it brings plenty of problems: sourcing the frogs, having to actually do something disgusting, and the ethical issues surrounding it all.
Doing it on the app is still a bit morbid - having to pin back the limbs doesn't feel wonderful - but it's really interesting and still delivers the biological insights needed.
There's a quiz in there too to help you learn what's actually being shown, and it turns out that I messed it up badly. I did actually learn something here.
What did I learn? I learned how a frog grows from spawn to a proper fully-functioning adult, and where all the bits go - as well as learning that the liver plays an important part in digestion. Mum is going to be so proud.
One of the things shown off by Apple on stage with the launch of the new iPad was the way you can use Apple's inbuilt apps to create new takes on a project.
GarageBand was demonstrated as one way to do this in a classroom setting, showing how you can use bundles of sounds to create something new - in our case, making a project that enabled us to show our understanding of the first moon landing.
We ran through it at such a rate that I didn't get the chance to listen to all the spacey samples from the '60s, and the speakers on the tablet were too quiet to make out what was being played in the noisy classroom setting.
However, the learning process wasn't the use of the app, but more being able to interact with the moon landing in a new way. As part of a wider project, I can see this working well.
What did I learn? Not a great deal, to be honest. I liked finding the loops in the sound package that sounded cool, but I didn't feel I gained a huge deal of insight into the moon landing in my short time.
My friend did put 'some sick dubstep beats over Twinkle, Twinkle, Little Star,' so perhaps I just wasn't trying hard enough.
Waypoint is a 'standard' AR app, where you fire up the title and hold the iPad's camera over a book, with it suddenly coming to life.
As part of a small project, you can hover the camera over books to do with Ancient Rome and see things in a new light. I got to examine the Colosseum from different angles, seeing how it looked in its pomp, rather than just having to study it from pictures or artists' renderings.
It was the recreation of the Mount Vesuvius eruption that got me thinking though - I had no idea there was a second city destroyed in the event, but Herculaneum was covered in the volcanic pyroclastic flows just like Pompeii.
However, Pompeii got all the fame while Herculaneum barely rates a mention in modern history books - so being able to see where the two cities sat, and how they were destroyed, was fascinating.
I also got to put sunglasses on Julius Caesar and come up with my own motto, but I can do that any day of the week.
What did I learn? I learned that I've been in the dark about the eruption all my life, and the people of Herculaneum need to be remembered. Plus I learned the word pyroclastic, which is fun to say.
One of the big plays from Apple in the education space is getting students to learn to code - Swift Playgrounds does this easily, and has the added benefit for the brand of getting students understanding how to make apps for its platform.
Many of the 'lessons' I undertook at the Apple event were built around the new coding platform, but applied out in the real world.
Firstly, I had to control a Sphero through a model city, navigating the little ball through the blocks to get past cars and other obstacles blocking its way.
It's harder than I thought. Even with the commands already programmed in for me, such as moving left and right, I still had trouble working out how to 'code' the distance in for how far the ball should roll.
The person teaching me gave me the kind of instructions you'd give a young child, and I still struggled a little… there's definitely something here that I need to learn more on.
Also, my Sphero wouldn't quite follow the roads laid out, which irritated me no end.
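The core of that Sphero lesson is just sequencing commands with a heading and a distance. Here's a minimal Swift sketch of the idea - the `Ball` type and its `roll` command are made up for illustration (the real lesson uses Sphero's own Swift Playgrounds commands), but it shows how 'coding the distance in' determines where the ball ends up:

```swift
import Foundation

// A toy stand-in for the Sphero rolling commands - hypothetical, not the real API.
// It just tracks a position on the model-city grid so the effect of each command is visible.
struct Ball {
    var x = 0.0
    var y = 0.0

    // Roll a given distance along a heading in degrees: 0 = straight ahead, 90 = right.
    mutating func roll(heading: Double, distance: Double) {
        let radians = heading * .pi / 180
        x += distance * sin(radians)
        y += distance * cos(radians)
    }
}

var ball = Ball()
ball.roll(heading: 90, distance: 3)   // three blocks to the right
ball.roll(heading: 0, distance: 2)    // two blocks forward, past the parked cars
ball.roll(heading: 270, distance: 1)  // one block back to the left

print(ball.x, ball.y)  // the ball's final spot on the grid
```

Get one distance wrong and the ball overshoots the junction - which is roughly what kept happening to me.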
The other Swift Playgrounds option was for making a robot dance - I've played with this one before, but the idea of looping the moves and deciding which to put where was still a little complicated for my newbie mind.
Making the robot boogie based on my wishes was cool - and for kids, that's a great little reward at the end.
It would be interesting to see how to actually create the moves that I just dragged and dropped in sequence, but given I'm clearly poor at coding, it's probably best I don't open that door just yet.
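For the curious, the looping concept from that dance lesson can be sketched in a few lines of Swift. The move names and the `danceRoutine` function here are invented for illustration - the real lesson drives a physical robot with its own command set - but the underlying idea is the same: drag moves into a sequence, then wrap that sequence in a loop:

```swift
// Hypothetical dance moves standing in for the robot's real command set.
enum Move: String {
    case wave, spin, clap
}

// Build a routine by putting moves in sequence, then looping the whole sequence.
func danceRoutine(moves: [Move], repeats: Int) -> [String] {
    var performed: [String] = []
    for _ in 1...repeats {          // the loop is the key concept the lesson teaches:
        for move in moves {         // the same sequence runs again and again
            performed.append(move.rawValue)
        }
    }
    return performed
}

let routine = danceRoutine(moves: [.wave, .spin, .clap], repeats: 2)
print(routine)  // ["wave", "spin", "clap", "wave", "spin", "clap"]
```

Change the order of the moves or the repeat count and the robot boogies differently - that immediate feedback is what makes it click for kids.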
What did I learn? Again, not a lot. The thing that comes across with Swift Playgrounds is the need to run code in order, and how to alter the commands within each block of code.
It's not complex, but even still I struggled… the main thing I learned is I'm poor at coding and I need to get better if I don't want to get left behind in the future when all kids are skilled at it.
Parrot Education was on show too, and it also used the Swift Playgrounds framework - an app outside Apple's basic Playgrounds platform, but using the same commands.
I also got to fly a drone through a hoop, making my own choices about where it should fly and for how long.
It was frustrating that it kept moving to the left because of some slight gusts, but who cares… drones are too much fun.
What did I learn? I learned that some things are out of my control, and that trial and error is an important part of coding - not everything works first time. Not that I really cared, because I was playing with a drone… and that had me engaged already.
- It's already happened: read our hands-on new iPad (2018) review