The Immersive Lens Podcast

Paul Engin | Dave Ghidiu | Jeff Kidd

Episode 9: Accessibility


 

In this episode of "The Immersive Lens," hosts Paul Engin and Dave Ghidiu kick things off by discussing recent tech and AI developments. They explore how AI is being trained on physics rather than previous human designs to create highly efficient printed circuit boards, effectively avoiding human biases and errors. The hosts also recount Paul's recent trip to London where he experienced new AR glasses and gesture-controlled mini drones, before shifting to the challenges of spotting AI-generated content in fake 1980s He-Man movie trailers and YouTube shorts. Additionally, they briefly touch on Google Stitch, a web-based application used for rapid app UI prototyping.

The episode's deep dive centers on digital accessibility and "universal design" - the concept that designing for disabilities, such as adding closed captions or physical "curb cuts," ultimately benefits everyone. The hosts share practical advice for making educational and digital content accessible to comply with mandates like Title II of the ADA. They recommend using AI to generate highly descriptive alt text for images, utilizing accessibility checkers in programs like Adobe Acrobat and Word, and improving visual accessibility through high-contrast dark modes and legible sans-serif fonts like Atkinson Hyperlegible. Finally, they highlight the robust built-in accessibility features found across Mac, Chromebook, and mobile devices, concluding that designing for accessibility is always worth the effort to level up the experience for all users.




Key Topics

AI is Training on Physics to Overcome Human Bias: Instead of training AI on existing human-designed printed circuit boards, which contain errors and limit capabilities, developers are successfully training models on real-time physics. This new approach avoids human biases and has drastically reduced the design time for circuit boards from over 400 hours down to just 38 hours.

AI-Generated Video is Becoming Harder to Detect: Convincing AI-generated videos, such as fake 1980s He-Man movie trailers and realistic YouTube shorts, are making it increasingly difficult for viewers to identify authentic content. This growing challenge highlights the potential need for platforms to implement better algorithms or mandatory disclosures to identify AI creations.

Universal Design Benefits All Users: Digital accessibility goes beyond legal compliance; the concept of "universal design" demonstrates that designing for disabilities ultimately improves the experience for everyone. Everyday examples like physical curb cuts, high-contrast dark mode, and closed captions were originally designed for specific accessibility needs but have become widely utilized features that the general public relies on.

Creators Have Powerful Tools to Improve Accessibility: Content creators are encouraged to use built-in accessibility checkers in software like Adobe Acrobat or Microsoft Word to easily identify and fix issues in their documents. Furthermore, creators can leverage AI tools to generate highly descriptive alt text for images, and they should opt for highly legible, sans-serif fonts like Atkinson Hyperlegible to make reading easier for all audiences.





Transcript

Click here to view transcript
Paul: Can you see this?

Dave: Um, there we go, I can see it now.

Paul: He's getting old, he's having trouble seeing things.

Dave: Not in my backyard, I can't see that. Accessibility.

Paul: That's right.

Paul: Welcome to the Immersive Lens, the podcast exploring the technologies reshaping how we live, work, and learn.

Dave: From AI and virtual reality to creative media and design, we're diving into the tools and ideas shaping our connected world.

Paul: My name is Paul Engin, join us as we uncover the people and ideas driving the next wave of interactive experiences.

Dave: And I'm Dave Ghidiu. This is The Immersive Lens.

Paul: Hey Dave, how are you doing, what have you been up to, anything exciting?

Dave: I'm doing well, I'm doing well, we're coming off break, the winter break.

Paul: I did see something fascinating about AI and I've been doing a lot of research into how training is done, so we gave that webinar last week about bias in AI, and we had that podcast that just dropped last week as well. So one of the ways that training has gone wrong, and I'm going to use a specific example, this was an article that I saw in VentureBeat, so I'll make sure that's in the liner notes or the episode notes for this. But there was a startup. One of the problems with manufacturing PCBs, so that's printed circuit boards, is that almost everything in the room that we're in right now has a printed circuit board. And you know, you take things apart, it's the green thing with the copper lines and all that stuff.

Dave: Linking everything.

Paul: Yeah, so there's trying to get AI to do better layouts. So whenever a board needs to be designed, there's kind of three stages. The first is the schematic, so someone draws like here's where the resistors go, or what you need. And then someone actually manually draws the physical layout in CAD software.

Dave: Yes, and Fusion has a module where you can do that. It's pretty cool. And then you can send it to someone to manufacture.

Paul: Oh really? So you can make your own?

Dave: Yeah.

Paul: Oh okay, you don't need AI.

Dave: But go ahead, go ahead, go ahead.

Paul: And so then the third thing is the manufacturer, and that part's pretty easy. But the layout, that middle step, creates a bottleneck, and there's a few reasons for that. It can take up to four to eight weeks to lay out just one board, because you can't have things too close or they'll short circuit, but you do want it small. So there's an awful lot of specifications. So they thought about training AI on existing circuit boards, and they found that that was not a viable option, and there's three reasons why. So again, this is an example of biased AI training. The first reason was because humans make frequent errors, so if we were looking at these boards and training on them, we'd be training with the errors in them. The second is that the best designs aren't even publicly available. So when companies really, really crank out great circuit boards, it's not like you can train on that, because it's not on Wikipedia, it's not anywhere.

Dave: Right, right.

Paul: And the third thing is they realized they're like, well, if we train on existing boards, not only might there be errors in it, but also we're limiting AI, it's not, we're not having it push past what the human capabilities are. Because it recognizes it's a very niche set anyhow.

Dave: Right, absolutely. But how beneficial would it be if they could make it more efficient?

Paul: I'll let you keep going, go ahead, go ahead.

Dave: That's perfect timing. So what they're doing instead is training the model on physics, like actual physics, and then saying, design this board without giving it too much insight into what boards look like. So they're not even using PCB training for PCB generation, they're just giving it real-time physics problems to solve, which I think is kind of neat.

Paul: That is awesome. And so are they finding that successful?

Dave: So far, yeah. The company is Quilter, and they laid out an actual board, a design that in the past would have taken about 420 hours, in just 38 hours.

Paul: That is, that is amazing.

Dave: Oh yeah, that's about a 90 percent reduction.

Paul: Yeah, the intricacies that go into building a circuit board with all the paths and transistors and connections is just, I mean, that seems like a great place for AI to make things more efficient.

Dave: Yeah.

Paul: Cause they have all the knowledge of previous boards too and what doesn't work.

Dave: Right.

Paul: And that might be uncorking maybe a new way of training, or a new way of thinking about models. And I'm sure other companies are doing this where we just say like, hey, here's a problem to be solved, don't look at what we've done in the past, we'll tell you what didn't work, but like here's physics, like understand physics and then you can understand maybe unlock all this potential. So that's what I've been thinking about.

Dave: That's really cool. Yeah. How are you doing, man? What, I haven't seen you in a few weeks.

Paul: Yeah, I had an opportunity to get together with all the kids, so that was really good. One of my kids is out in London, so we went across the pond, across the pond, the five-hour pond.

Dave: Holy smokes. Across the pond.

Paul: Yep. My favorite phrase from my trip is "mind the gap".

Dave: Mind the gap on the tube, right?

Paul: On the tube. Yep. So it was a really great trip, and you know your daughter knows you when she makes one of the stops a place called Selfridges.

Dave: Selfridges.

Paul: Yeah, so Jeff, you would die for this place, it is like a B&H, it's a...

Dave: And B&H is that's a place in New York City that has like all the cameras and the equipment, tech, yeah.

Paul: And so it's a Macy's, think of it as a Macy's, and in the basement they had all of the electronics. So they had holograms, all the DJI drones, they had derivative drones, they had mini drones, and it got me, and I'm going to talk to Jeff about this later, but they had a mini drone and the demonstrator literally just moved it up through gestures. It tracked him as he walked down the aisle and recorded him walking down the aisle, and then he turned around and it followed him, and then he just extended his hand out and it landed on his hand.

Dave: So we don't need selfie sticks anymore?

Paul: No, you can just do these little mini drones and it goes up. But I also had an opportunity while I was there, I know, people are like, you went to London and you went to play with drones.

Dave: But you did all the traditional stuff too? Big Ben, Parliament?

Paul: Big Ben, Parliament, yeah. So we did all the sightseeing stuff and it was a really great trip, but while I was at Selfridges I also had a chance to look at all the AR glasses.

Dave: Oh. I've only seen those in videos. So we talked about those before, but those are glasses that project a little bit of a screen on the lens.

Paul: Yes. And it was really eye-opening. One of them wasn't really, I don't consider them AR, so augmented reality, but what it was was something you would plug into your computer or your phone and you could see it as a second screen. So I could... so, but it looked like regular glasses. So it wasn't Vision Pro, like, you know, the big Apple bulky headset. It was actually just a regular, and I was like, oh my gosh, I could, this could totally be something that someone could use, and it's plugged in so there's no battery, it's just literally a viewer with regular glasses. And then I had a chance to look at the older generation of Meta glasses, they don't have the latest one. But it was really cool. I put them on and I was able to like look at my watch and say, what am I looking at? I said, you know, "Hey Meta, what am I looking at?" And it said, it looks like you're looking at your wrist, and on your wrist is a, you know, ultra, you know.

Dave: Oh, it knew like what model watch you had?

Paul: Yeah, and it could tell you a little bit about what you're looking at. So I was like, oh my gosh, okay, I could see, you know, think about industry or anything else where you can just look at something and go, all right, I'm trying to figure out what I'm going to do here. Can you take a look at this, you know, "Hey Meta, tell me..."

Dave: Working on a car.

Paul: Yeah. And so I think there's a lot of potential there, and again, now you have the displays that are being added to it. So.

Dave: Yeah, that'd be great for like any, especially in the manufacturing, you know, things come down the line and it just tells you, plug this thing into here or like if I'm working on something in my car. Yeah, oh I like that. It's like having YouTube in front of you, on demand. Customized.

Paul: And then... So, so that was a great trip, great experience, got to see all that stuff, got to play with some toys, and that was really good.

Dave: And then yesterday I saw something that came out which gave me a huge throwback, and I'm hoping you and Jeff are old enough to remember He-Man.

Paul: (Humming theme) Of course.

Dave: So Jeff, do you know He-Man?

Jeff Kidd: I do.

Dave: Okay, so the trailer, they gave a teaser and then today the trailer came out, and I've been in this 80s nostalgia phase as I'm driving here, I'm hearing all these 80s songs and I got to watch the trailer, I'm getting excited.

Paul: Correct me if I'm wrong, so I saw some trailers and it was really hard to tell what was real and what wasn't.

Dave: That's what I wanted to tell you about.

Paul: But one of the trailers started with like "In the 80s" and it had like clips from the 80s like commercials that I recognized like the cereal and stuff.

Dave: Yes, that was the real...

Paul: They had me at like, I was like, this is my jam.

Dave: Yeah, that was, that was such a great teaser. But that was what I wanted to talk about, when I was looking for the trailer today, I went through six different videos that were all AI generated based on the real trailer.

Paul: It did not take a lot of time for that to happen.

Dave: No. And I think to myself, alright, there needs to be an algorithm added to YouTube or something that is like filtering AI when things like this...

Paul: There should be. You're supposed to disclose it.

Dave: It was very frustrating. The only way I got through it, like, I'd look at the comments right away and they'd say this is AI, this is AI slop, this is AI crap, you know, it just went down and I was like, okay, so next one, next one. I finally found the trailer, but it was funny, because the first one I watched I got so excited, I was like okay, and then I was like, something's off, the audio seems weird.

Paul: Yeah. Like you could tell like... and I'm like, oh crap, this isn't it.

Dave: Yeah, I fell for it.

Paul: And I usually look at who posts the videos, like Marvel Studios or Kino Central or whatever, but even that, I feel like it's harder and harder to do. In fact, I fell for one. I was watching this video as a YouTube short, it was like 90 seconds long, and it was this guy out in the woods, and it was one of those time-lapse ones where he's wearing a GoPro sometimes, sometimes sets the camera up, and he's like building something. I was like, oh this is so cool, he's digging things out of a stump, and he turned it and put concrete down there, like built out this tunnel underneath it and put a hatch on it. I was like, this is so cool. And I showed it to my wife, and she's like, you're an idiot, that's AI. I was like, oh no, is it really? And sure enough...

Dave: Because I think you want to, you want to believe it's real.

Paul: Yeah, and it was happening so fast, like had it been normal speed I would have probably caught it. But like the tree stump changed sizes and like it got taller and shorter and I was like oh man. And then he puts like a refrigerator down there and I'm like, how'd they get the refrigerator down that tiny hole? And I was like okay, yeah.

Dave: Alright, yeah, yeah, yeah.

Paul: Man, but you got to be on the lookout.

Dave: Yeah, and then lastly I wanted to let you know that I did try Stitch, from Google.

Paul: Oh, the Google one. Yeah.

Dave: And so...

Paul: Remind us what Stitch is.

Dave: So Stitch is a Google application, it's all web-based, but when you log in you can tell it what kind of UI or interface you want it to design for a specific app. So if you're trying to build a web app or a standard phone app, you can kind of just tell it the type of design you want and it'll give you variations, and it is amazing. It's equivalent to, if you've used Figma before, wireframing or prototyping, it's a quick way to do a quick prototype. And it gave me derivatives of that, and it was really eye-opening, and I was thinking to myself, oh my gosh, I was just able in under five minutes to get six different style variations of an app I'm thinking about developing, and they all look good. So now I'm torn between... I mean, it did an amazing job with it.

Paul: And you might be able to, like, A/B test them or whatever too. And they actually released a suite of like 10 tools, Stitch is one of them, for marketing and for production and design.

Dave: Google?

Paul: Yeah.

Dave: Oh my gosh, they are coming out with so many things so quickly, but alright, we got to get to the deep dive topic today, which is accessibility.

Dave: And...

Paul: What do you mean by accessibility?

Dave: So, there's different forms of accessibility. Before I joined FLCC, I used to do corporate training, and anytime we'd do training for like a government agency, we had to comply with what's known as 508 compliance. Which is basically a federal mandate that says the content you create for the government needs to be accessible to people who might be visually impaired or hearing impaired. So things like closed captioning or transcripts anytime there's a video, they needed to be added to it, and then things like tabbing, so you can jump around the screen with a keyboard, and there needs to be some kind of description that says where you are and describes it.

Paul: Oh, so if you're filling out a form or something.

Dave: Okay. Yep, and it actually has to be for your whole interface, so if you have a navigation, it's not just for a form. Um, but I know we're going through something here in education, and what's that compliance?

Paul: The Title II, so this is the Title II of the Americans with Disabilities Act, and it requires state and local governments to provide people with disabilities an equal opportunity to benefit from all their programs, services, activities. And so we need to go through all our content and make sure that it's accessible by everyone.

Dave: Right. Which is a great exercise.

Paul: It is, it is, and I think that there's benefits. I think there is a little more work that has to be put into it, but there's a positive outcome for all of this afterwards, because now you've built pretty robust content.

Dave: Yeah, and as we're going through this process at the college, it makes me think of universal design, and there was this fantastic episode of 99% Invisible. Friends of the pod, friends of the pod. And...

Paul: What's 99... I don't even know what that is.

Dave: It's a podcast about design. I'm surprised you don't listen to it. But they did an episode about universal design, and the canonical example they give is curb cuts. So curb cuts, like when you're walking in cities, are the ramps that go down from the curb to street level, and they were initially designed for wheelchairs, because otherwise how could you get up and down the curb. So they have these curb cuts, and it seemed like people were like, oh, well, this is just going to disrupt things, I might trip over it or whatever. But it turns out, people with strollers love curb cuts. People on bikes love curb cuts. People who walk with their groceries, either with a cart or dragging the cart, love curb cuts. So curb cuts are a great example of something that was designed for accessibility but has really had these profound implications for everyone. And so that's kind of the gold standard when we're doing any design, especially in education: make it available for everyone.

Paul: Right. And I think that that's really good, you know, as, as we're going through this, and I know I labeled, like I mentioned closed captioning, but there's more to accessibility than just the closed captioning. I think it's trying to create a similar experience for everybody, no matter what their disability is. Uh, so when they're going through content, they can have a similar experience, if that makes sense.

Dave: Yeah. And the captions, I'm glad you said captions, because that's another great example of universal design. So it's obviously for people who are deaf or hard of hearing, they could read the captions, but it's great for if you have your TV on and you don't have the volume on. And that happens, I don't know, say in waiting rooms, but uh, a great example for my learners are, you know, if they're, you know, sitting with their kids at night trying to get their kids to go to sleep and they're watching a YouTube video, put the captions on and you don't have to have the volume up. So there's, or for learners who maybe English isn't their first language, now you can get captions in multiple languages. So it's, it's universal design baby.

Paul: Yes, and you know, it's funny you mention that captioning, because I feel like that's being used more and more for general use. So even when, you know, we'll watch a show that's got a strong British accent, sometimes it's tough to understand, so we throw on captioning, um, so we can understand what's being said in the show. Um, and I think, you know, there's a study finding that more people are turning on captioning when they're watching movies at home.

Dave: Oh absolutely, and part of it is because of limitations, like the way that we mix audio now is different. Uh, but there's all sorts of reasons, but it is great. Uh, and if you're living without captions, like at first I thought they were really annoying, but now we have captions on and I'm like, oh, this is great.

Paul: Right, I agree. I agree. And then, so, uh, I mentioned alt tags a little bit. Do you want to talk a little bit about what those are and, and, you know, how you could integrate those?

Dave: Yeah, so alt tags are alternative text. So for instance, again, if you're perhaps blind or visually impaired and you're on a website and there's an image, your screen reader, and the screen reader is the voice that reads the items and text on the screen for you, right, it will just say "image". And if it just says "image", that doesn't help, that doesn't provide the experience that we're hoping for. So you can include alt text. So when you are authoring content, and you can do this in Google Docs and Word as well, if you right-click on an image you can insert alt text, and you should. And you can put a description of the image, so when the screen reader reads it, instead of saying "there's an image here", it will say "there's an image and here's what it is", and it will describe whatever you put in the alt text. And that's actually also universal design; it has some great ramifications in, say, developing nations where bandwidth isn't great, or where you have very limited bandwidth. You can say don't download images, because that takes up a lot of space, but if you have the alt text there, they'll get the text so they can still see the intent. And so as I'm going through all my content and making it accessible, I just take every image and upload it to AI and ask, can you create alt text for this? And so I just have one conversation and then I get, um, alt text for all my images.
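[Editor's note: for listeners who hand-author web pages, the alt text Dave describes is the alt attribute on an HTML img tag. Below is a minimal sketch, using only Python's standard library, of how you might flag images missing it entirely; the sample page and file names are made up for illustration.]

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute at all.

    Note: alt="" (present but empty) is the correct markup for a purely
    decorative image, so it is NOT flagged here."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # Without any alt, a screen reader can only announce "image".
            if "alt" not in attr_map:
                self.missing.append(attr_map.get("src", "?"))

# Hypothetical sample page
page = """
<img src="chart.png">
<img src="divider.png" alt="">
<img src="dog.jpg" alt="A golden retriever catching a frisbee">
"""

checker = AltChecker()
checker.feed(page)
print(checker.missing)  # ['chart.png']
```

The empty-but-present alt="" case ties into the decorative-versus-descriptive distinction the hosts raise at the end of the episode.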

Paul: Oh that's great. Yeah, I know for web design, um, that's always a discussion point, you know, you add alt text to images, add alt text to descriptors or descriptors of, uh, maybe an illustration of something that is on the page.

Dave: Yeah, and that's another good reminder: your images should never have text baked into them. And if an image does have text, you should use alt text to say explicitly what it is. Images should not convey text.

Paul: Yeah. Because I think that's one of those things, um, it's funny when my students are doing web design and they're doing the, um, the images, if there's text in one, I always say, you know, you could do this without the text being on the image, so this way it's searchable too. So it's like multiple reasons, universal design baby. For that, um...

Paul: The other thing, uh, one of the tools that I found actually does a pretty good job at this is Adobe Acrobat as well.

Dave: And that's a PDF.

Paul: Yeah, so um...

Dave: Is that for creating PDFs, reading PDFs?

Paul: Both. Yeah, so you can create, build, edit, uh, and I found, when I was doing it this semester, um, I opened up Acrobat and there's an accessibility checker. And the accessibility checker allows me to, um, go through everything and tag everything, and it has an option for the images, so it gives it...

Dave: So for every PDF you'll go through it and it will say like, I understand this text, I need you to fill in the text for this image?

Paul: Yep.

Dave: Is that why I see sometimes like PDF versus PDFA?

Paul: That I don't know, because it's still dot PDF. Um, I just ran it through the checker because I uploaded it and our, um, learning management system gave me a little indicator and said it was not accessible. And I said, what? What was wrong with it? Um, so then I took it, and that's how I found the accessibility checker in Acrobat, um, and then it ran through it: it didn't have alt tags on the images, and so on. So I was able to fix it, it popped up things, it gave suggestions, and I could modify the suggestions, um, of the descriptors for the images, for instance.

Dave: Yeah, so for people out there listening knowing that they have to go through this, was it onerous, was it time consuming, or was it like once you get in the groove you fly through doing it?

Paul: Once I found it, yeah, it was really good. Um, so like for instance, when you're doing your alt tags, you said you put it in the AI, but then where are you putting it, in a Word document, or in Google Docs, or...

Dave: I'll take the image from my website or whatever, and I'll upload that image to AI, and I have a prompt that I use, I'll put that in the show notes, and I'll say "generate alt text", and it will. And then I'll just copy the text that AI returned to me and put it in Word or in Google Docs.

Paul: Gotcha. Gotcha.

Dave: And there is an accessibility checker, I think, in Word as well, which is good to know.

Paul: Oh that's good. And I think most of these applications now have accessibility checkers, um, and I know I just went to Adobe Express, which we'll talk about in a second, but, um, it's got add-ons for not only general accessibility but contrast accessibility.

Dave: And contrast is a big thing. People don't think about that, but say a lot of times I'll have a black background with white text, and the contrast there is pretty good. But then if you start having an orange background with white text, it gets a little bit harder to read, and you're like, oh, is this going to pass accessibility?

Paul: Right. And I don't know the exact threshold, but I always tell my students you need that 40 percent difference to be somewhat legible. So the higher the contrast, the better it is for people to see, and obviously a lot of, um, our computers and our mobile devices now have that dark mode.
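[Editor's note: the WCAG guidelines commonly used for compliance formalize this as a luminance ratio rather than a percentage, with 4.5:1 as the minimum for normal-size text and 3:1 for large text. Here is a small sketch of the standard formula; the sample colors are just illustrations.]

```python
def relative_luminance(rgb):
    """WCAG relative luminance of an 8-bit sRGB color (0.0 = black, 1.0 = white)."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """(L1 + 0.05) / (L2 + 0.05), with L1 the lighter of the two luminances."""
    l1, l2 = sorted((relative_luminance(color_a), relative_luminance(color_b)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background: the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0

# White text on orange, like Dave's example, falls short of the 4.5:1 minimum.
print(contrast_ratio((255, 255, 255), (255, 165, 0)) < 4.5)  # True
```

Because the ratio always puts the lighter color on top, the order of the two arguments doesn't matter.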

Dave: Yeah.

Paul: Which is nice.

Dave: Universal design baby.

Paul: And I do think that that actually stemmed from developers, because if you remember, developers always used to switch their stuff to dark.

Dave: Used to? Still do.

Paul: But before, before we were cool, before it was cool. That's right. But then they said, well, everybody's doing it, so why don't we implement this as a view option for everyone. Um, so yeah, so now everyone has an option for dark mode or light mode.

Dave: Yep, which is nice. I like it.

Paul: Yeah, no it's great.

Dave: Um, and when you were talking about contrast and making things easier to read and harder to read, it made me think of these fonts. So there's a number of fonts, my wife just pointed this one out to me, it's called Atkinson Hyper... legible, or Hyperlegible. Can you see that?

Paul: Yep, Hyperlegible.

Dave: So look at that down there, and folks, you should definitely check this out, but there are fonts that are made to be legible. One example is, uh, you know, the number one versus the capital I versus a lowercase i versus a lowercase l; those are very hard to tell apart in some fonts, but not in this font, and, uh, so they really did a great job of making this font super easy to read. A few of the other notable mentions would be Dyslexie, which is a font that was created to help people with dyslexia read. There's a font called Sans Forgetica, and I don't have an example of it here, but I'll put one in the show notes, and it's meant to help you remember what you're reading. So it's missing some of the strokes, so a capital F might be missing part of a stroke, and then your brain has to do extra work to understand it.

Paul: Yes.

Dave: Um, and then Lexend, which is a font that's available freely in Google, uh, and that was intended to reduce visual stress and improve reading performance. Initially for dyslexia, but because of universal design it was great for everyone, so that's kind of a standard font now.

Paul: Yeah, that's interesting, um, because we talk about, um, a Gestalt principle, which is a psychological approach to design, on some of these things, and one of them is, um, closure. So your mind automatically closes shapes; even though you don't see a complete shape, your mind can make that leap. Um, so it's really interesting that you mention they've integrated that into a typeface, you know, something like that where your mind is processing.

Dave: Yeah. Gestalt, I like that.

Paul: Um, so uh, as far as uh the typeface I agree with you 100 percent. Um, I have a pet peeve as I get older, um, get off my lawn. You know, when you hit, when you hit 28, you know, it starts...

Dave: Ooh. Okay, that's a stretch.

Paul: Uh, but um, the type, um, like when I get a box and it gives me instructions for, you know, how to cook. I had this one that was a script font, so like a decorative font.

Dave: Yeah, like cursive or something.

Paul: Yeah. And um it was small and I was like I can't even... so you know I have to get my glasses or do the phone magnifier...

Dave: Yeah, the phone magnifier.

Paul: Yeah, and, and to read it. But I'm hoping that someone in the company looked at this and went, why is this a script font? Because the next version of the box that I got, it was a sans-serif font. I could read it, and they made it slightly larger. And when we talk about accessibility, you know, you don't think about it, but that's accessibility, right, it makes it easier for someone to consume the content.

Dave: Accessibility is a design challenge.

Paul: Yeah. It is.

Dave: And you said sans-serif. Just for my knowledge, why don't you tell me what sans-serif means?

Paul: So there are the serif fonts, which have a little decorative element on the edges of the type.

Dave: Like the tails of the letters?

Paul: Yeah, like a Times New Roman. Um, and then, uh, the Atkinson Hyperlegible is a sans-serif font, or, um, Arial or, yeah, Helvetica, those are all sans-serif fonts, which don't have the decorative elements. And they found that sans-serif fonts are far easier to read on smaller displays. So it's one of the reasons Google shifted from their serif font a long time ago.

Dave: Oh yeah, back in the 90s?

Paul: Yeah, when they started doing Android, they started moving toward the sans-serif font because obviously more legible on their watch faces, on their phone faces.

Dave: I did not know that.

Paul: So, um, so design was driving that.

Dave: Yeah.

Paul: So it's really interesting because sometimes it's things you don't think about, um, as far as that goes.

Paul: And uh I know we talked about captioning before, um, but YouTube has uh automatic captioning.

Dave: Which is pretty good these days. It didn't used to be, but it's pretty dang good.

Paul: And it's getting better and better. Um, I know we have to look at the captioning for ours, because they don't get our names correct all the time, right? But I think it does a pretty good job. There's Adobe Express, which is a web-based platform where you can just log in. I think there's a free version, you know, where there are limited things you can do.

Dave: And does like some design work or small movies and stuff.

Paul: Like Canva, I'm assuming Canva probably does this too but, um, where you can upload your video and it'll do visual captioning for you as well. So...

Dave: Descriptive audio?

Paul: Descriptive, yep. Well, no, it's not descriptive. This is just captioning, so it's your audio, but it's visible.

Dave: Oh, I see, on the screen. Oh that's how they do that on like TikTok and Shorts and stuff.

Paul: Yes, exactly. And you can time it out so it highlights the word that's being said. Um, but yes, so TikTok is a perfect example and uh, and Shorts, they have the same thing. But you brought up a great point, so there's a difference right between descriptive audio versus CC, right?

Dave: Yeah, the closed caption. So closed caption is what I grew up with, where you turn the volume down and you can still see the words.
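For reference, the closed captions Dave describes are commonly distributed as timed text files such as WebVTT; a minimal fragment (the timestamps and cue text here are hypothetical) looks like:

```vtt
WEBVTT

00:00:01.000 --> 00:00:04.000
Paul: And that's when we talk about accessibility.

00:00:04.500 --> 00:00:06.000
Dave: Accessibility is a design challenge.
```

Players show each cue only during its time range, which is what lets viewers read the dialogue with the volume down.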

Paul: Correct. And descriptive would be describing what is happening in the video. So I think there's a distinction here, and I haven't really explored this too much, but that's really what the alt tag is, right? If you think about alt tags for an image, you have two options. If it's a decorative element, you usually don't need to do a descriptive tag.

Dave: If it's like a swirl or a line or something.

Paul: Right. But if it is an image that's relevant to, in this case we'll say learning, you need to describe it: this is a bar chart of blah blah blah that shows XYZ. This way, someone who can't see it can hear what the image is conveying, if that makes sense. And that's descriptive.
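The two alt-tag options Paul describes can be sketched in HTML (the filenames and chart details here are hypothetical, just for illustration):

```html
<!-- Decorative element: empty alt, so screen readers skip it -->
<img src="divider-swirl.png" alt="">

<!-- Meaningful image: describe what it conveys, not just what it is -->
<img src="enrollment-chart.png"
     alt="Bar chart of fall enrollment by program, showing nursing
          rising from 120 to 180 students between 2022 and 2024">
```

The empty `alt=""` is deliberate; omitting the attribute entirely would cause some screen readers to announce the filename instead.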

Dave: So let me throw this at you. Watching The Lion King. There's that scene where Simba's born, and Rafiki holds him up on Pride Rock over everyone. So the music, it was probably Circle of Life, I don't know. The closed caption would be that, but the descriptive audio would be like, there is a monkey ceremonially holding a little lion cub above everyone. Okay.

Paul: Yeah, with other animals watching in a triumphant manner.

Dave: And bowing, and bowing. Okay. So you know your Lion King. Okay.

Paul: That's all this was, a test of my Lion King knowledge.

Paul: I passed.

Dave: You passed.

Paul: Um, and I think, you know, one of the things we can think about is that most Macs have accessibility features, and I know you'll talk a little bit in a second about your Chromebook. One of the things I use Mac accessibility for is someone who has a visual impairment and needs larger fonts. I'll show them that they can increase the type size, so I'll set up their system so when they log on it's a bigger view. And sometimes the other thing I'll do is bring a laptop in, and laptops sometimes don't have the display resolution they might need, so I'll HDMI them to a monitor that's in front of them so they can see it larger.

Dave: And it just mirrors it.

Paul: And it just mirrors it, if that makes sense. So it's those little things. And then for me, when I'm teaching, I do it all the time, and Jeff helped me with setting up my room, but an accessibility feature lets me zoom in. So when I'm talking about a specific element on the screen, I'm zooming in so everybody in the room can see it, versus someone in the back of the room going, what's he pointing at?

Dave: Yeah, and that's something, and it's very easy to do, you can pinch and zoom on a Mac on most applications and zoom in to right what you're doing, so it fills up the whole screen with whatever you're talking about.

Paul: Yeah, but this is doing it so you can, like... you can't see it here, but I'm holding the control key and zooming in, so I can really zoom into a specific UI element or a specific thing I'm clicking on, so everybody in the room can see what I'm talking about.

Dave: Yeah, and I do that when I'm making videos too; I'll pinch to zoom so I can highlight the specific thing. But you're right, Chromebooks have accessibility, and they have a menu for it. Unlike Windows, and I don't know how it is on a Mac, but in Windows accessibility is in like three different places. On a Chromebook it's right there, and everything has a toggle: ChromeVox, which is your spoken feedback, or select to speak, or dictation, it even has face controls, plus magnifiers, color inversion, all that stuff. But one of my favorite ones that I use is that my cursor has a red circle around it. So it's very easy to see where I'm pointing at any given time. I initially started doing that when I was making videos for my online learners, because it's much easier to follow a red circle than a cursor, but I got used to it and now I just leave it on and I love it. So I always know exactly where my cursor is.

Paul: That's a great yeah, it's those little things that you don't think about.

Dave: Universal design baby.

Paul: Yeah. It's it's great. And uh I know like iPhones have accessibility as well.

Dave: I think mobile phones generally are way better than anything else for accessibility.

Paul: I agree, because I think they kept that in mind once they started developing them. You can go to accessibility on an Android or an iPhone, and you can make your text bigger, make your contrast higher, do audio text, even change the way Face ID works, lots of different things. So you have a lot of control over it. They have eye tracking, and this is the iPhone specifically, so you can just use your eyes as a cursor. There's head tracking, eye tracking, voice control. So lots of different things you can do, and definitely something to explore on your device if you're running into things or having difficulty seeing something. Like you said, it's probably been thought of on your phone even before your computer or anything else.

Dave: Yeah. And I'm glad that it's going mainstream too. Captions are big everywhere, dark mode is big everywhere; people are starting to see that these features that have always existed are now really relevant, and they make your life easier.
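On the web, the mainstream preferences Dave mentions surface as CSS media queries that sites can honor; a minimal sketch (the colors are arbitrary placeholders):

```css
/* Respect the user's system-wide dark mode setting */
@media (prefers-color-scheme: dark) {
  body { background: #111; color: #eee; }
}

/* Bump contrast when the user has requested more of it */
@media (prefers-contrast: more) {
  body { background: #fff; color: #000; }
}
```

Because these queries read the operating system's accessibility settings, the user sets a preference once and every site that checks it adapts automatically.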

Paul: Yes, yes, absolutely. And it's interesting, because I think people don't think about it enough, but when they do, it really is an amazing thing: closed captioning was done for accessibility, but now people just do it in general, for themselves, because it's easier. And that same logic, making it easier for you to understand what's happening, is what helps other people who have an issue, who maybe can't hear it, you know what I mean?

Paul: So, what's your takeaway for accessibility? If somebody is thinking about doing accessibility as a requirement, what would you say: first, in general, is it worth it, and second, what are some tools you might have used or can think about?

Dave: I would say yes, it's totally worth it. For accessibility, so everyone can access your content, but also because of universal design: it just makes the experience better, it levels it up for everyone. From the work that I do, for anyone out there doing education or content creation: use alt tags, be very descriptive, and now that we have AI to do that, my prompt is like "be exceedingly descriptive." Whereas before I might put one or two sentences, now I can have four, five, six sentences. So I would just say use AI to do that. And then make sure you're captioning whatever you make, if you're making videos. The auto captioning is 99 percent accurate these days, so if it's a video that a lot of people are watching, just go through and brush it up if you need to.

Paul: And when we talk about captioning, I tell my students this too, when they're doing a site and they want to include video: you can absolutely add video to your server and host it, but I tell them there's a benefit to uploading it to YouTube, because something people don't think about is accessibility as far as bandwidth. I did some grant work with an institution, and it was going to countries where they had very limited bandwidth. A standard video, even if it's like 100 megabytes, is nothing here, but it's huge there. It's just not feasible. So one of the things I thought about is, okay, I'm going to compress the Jesus out of this, but the other thing is we can upload it to YouTube, which gives different variations based on your bandwidth and automatically adjusts the resolution.

Dave: Yeah, yeah. That's cool. I didn't know that.

Paul: So um so that's a benefit. I think the other thing that we did as an alternative is we did text and we did just audio. So the audio file is smaller, so they have that option as well. So...

Dave: We're doing that. Cuz we have HD video. We have the podcast version. And uh you can go to our website and get the uh transcript, the text version.

Paul: That's right, so you have everything. We are accessible, folks. And the closed captioning is only going to get better; YouTube keeps coming out with algorithms to make it cleaner and cleaner. There are a lot of different tools, like I mentioned, Acrobat.

Dave: Yeah, I'll have to dig into that, it seems like it's worth it.

Paul: 100 percent. And I can almost guarantee that any software, whether it's Word, Google, or Canva, probably has an accessibility option that does a check and will look at different things. And like I said, even in Adobe Express you can add an add-on that checks contrast, so color accessibility.

Dave: I bet even if it doesn't do it natively there's extensions that you can get too or add-ons for like Word and if you really want to get like deep down into the nitty gritty of the accessibility.

Paul: Yep, 100 percent, 100 percent. Alright, well uh that's all the time we have for today. Um, if you liked what you heard make sure you like and subscribe.

Dave: I'm Dave Ghidiu. If you enjoyed today's conversation, smash that describe... or that subscribe button. We talk about that description button, the alt description. And let's just be careful out there, folks.

Paul: Yes, and make sure you share it with your friends and colleagues. Until next time: stay curious, stay connected, and thanks for looking through The Immersive Lens with us. And everyone, mind the gap.

Dave: Mind the gap. This is the accessible immersive lens.

Dave: That's right. And I just want to stop right now: it says this episode was engineered by Jeff Kidd, and there's a lot of invisible labor that goes on. I started listening to these episodes as they've dropped, and you hear Paul and me on these, but Jeff is working so diligently behind the scenes. So thank you so much, Jeff.

Paul: Yes, thank you Jeff.

Dave: Super talented.

Jeff: My pleasure.

Paul: Um, this was recorded at Finger Lakes Community College podcast studios located in beautiful Canandaigua, New York in the heart of the Finger Lakes region offering more than 55 degrees, certificates, micro-credentials, and workforce training programs.

Dave: Thank you also to Public Relations and Communications Marketing and the FLCC AI Hub. That's a little bit more of invisible labor of people doing work under the hood.

Paul: Yes, absolutely. And if you're eager to delve into a passion, discover exciting and immersive opportunities at www.flcc.edu.

Dave: As part of our mission at FLCC, we are committed to making education accessible, innovative, and aligned with the needs of both students and employers.

Paul: The views expressed in this podcast are those of the hosts and guests and do not necessarily reflect the official position of Finger Lakes Community College.

Dave: Music by Den from Pixabay.

Paul: This is The Immersive Lens.

(Testing audio)

Paul: Testing, hello hello. Check, check, check.

(Screaming noise / weird sounds)

Dave: I don't think he's okay.

Paul: I don't know. I'm sorry. Check, check.
