By AppleVis Podcast
The podcast currently has 829 episodes available.
In this month's edition of Apple Crunch, Thomas Domville, John Gassman, and Marty Sobo discuss recent Apple news and other topics of interest.
Topics featured in this episode include:
Links:
If you have feedback or questions for the Apple Crunch team, you can reach them at [email protected]
Transcript
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome to Apple Crunch for September 2024.…
In this podcast, Thomas Domville reviews and demonstrates the Voices feature, which allows you to customize multiple VoiceOver voices to suit your needs. You can quickly access these voices using the Rotor Actions or the VoiceOver Quick Settings.
How to Add VoiceOver Voices to the Voices Feature on iOS
Now, you can easily switch between your customized VoiceOver voices to enhance your accessibility experience on iOS 18.
Transcript
Disclaimer: This transcript is generated by Aiko, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse.
I'm going to be talking about a feature called Voices.
So as you know, we have our primary VoiceOver voice that we use each and every day on our device.
Wouldn't it be great if you were able to access various voices, more than just one VoiceOver voice, on the fly?
Yep, you can do that.
It lives right in your rotor, if that's where you would like it to be.
In my case, I have it in my rotor itself.
You can also put that in the VoiceOver Quick Settings if you wish to.
And I'll be showing you how to add that to your rotor and Quick Settings if that's something you want to do.
But in my case, whenever I do a podcast, you probably always hear that I use the Siri Voice 4 voice, known in short as Noel.
And this is what I use when I do podcasting, but every so often I like to change things up and hear some other voices, and those would be Tom and Hans.
So those are my top two voices.
And so in order to access them quickly and easily, I place the Voices option within my rotor.
So let me give you an example of what it sounds like and what it looks like.
So I'm going to access my rotor and I'm going to go to Voices.
Voices, Siri voice 4, default, selected.
So if I swipe up, Tom, primary voice.
I have the Tom primary voice.
Or if I could just swipe up again, Siri voice 4, default.
I'm back to Siri voice number 4.
So this is what I'm talking about: how you are able to access voices so easily from your rotor or your Quick Settings, if that's what you choose to do.
So let me show you how I got that set up.
But before we can do anything, we need to add voices so you can…
In this episode, Tyler demonstrates how to customize the lock screen on iOS, specifically how to remove the flashlight and camera buttons and replace them with other controls.
In addition to viewing the time, date, and notifications, the Lock Screen can be customized to remove or replace the camera and flashlight buttons with other controls, or show certain types of information at a glance, such as upcoming calendar events or current weather conditions. To customize the Lock Screen, perform a one-finger triple-tap on either the time or date, double-tap Customize, and then double-tap “Customize Lock Screen wallpaper.” From here, you can double-tap the Remove buttons for default controls, the “add quick action” button to select alternative controls, or the “Add widget” button to select a widget.
Transcript
Disclaimer: This transcript is generated by Aiko, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hey, AppleVisers, Tyler here with a quick tip for how to customize the Lock Screen on iOS.
By default, the iOS lock screen includes the time, date, any notifications received since the device was last used, and at least on devices without a home button, shortcuts to the flashlight and camera functions.
Over the years, the iOS lock screen has gradually become more customizable, with the ability to add widgets introduced with iOS 16 in 2022, and the ability to remove the camera and flashlight functions or replace them with other controls the user might find more useful introduced with iOS 18 in 2024.
If, like me, you don't find the camera or flashlight functions particularly useful, or at least not useful enough to where you would want them to be among the first things you see when you wake your iPhone, you can replace them with other things you might find more useful.
So for me, I replace them with a shortcut to the alarm and also a single action shortcut that I created to set a 20-minute timer.
So when I'm about to work out, I just take out my phone, wake it, unlock it, and double tap the workout timer button on the lock screen.
And when I want to set an alarm, I don't have to go into Control Center or open the clock app or use Siri anymore.
I just double tap the alarm button on the lock screen and I'm taken right there.
So to demonstrate this, I'm going to wake my iPhone now, and I'm just going to explain first, so I don't have to explain while VoiceOver is talking and compete with that sound.
Once I unlock it, I'm going to triple tap either the time or the date.
Either one works.
You can triple tap or you can double tap and hold either one.
So I'm going to wake my iPhone now.
Do not disturb Friday 1 a.m. Okay, triple tap.
Astronomy wallpaper weather sunrise and sunset widget and clock next alarm widget button and illustration of red, blue and yellow rectangle.
Okay, so if I swipe left astronomy, that's the first element on the screen.
It's the current wallpaper I have.
You can have multiple.
So if you want to have different lock screens, like, for example, if you're working, you might want access to different types of information than if you're just on your own time.
If you want to link focuses, you can do that.
So if you have a work focus, you can have it…
In this episode, Tyler demonstrates some of VoiceOver's command customization capabilities on macOS.
If you find a particular VoiceOver command difficult to perform, or discover a function in the Commands menu that doesn’t have a default command, you can assign your own custom command to it. In addition, you can configure commands to open apps and run scripts, so you don’t have to locate them manually.
Commands can be configured by going to VoiceOver Utility > Commands, selecting the “Command set: user” radio button, and clicking “Custom commands edit.” For ease of navigation, you can choose the type of commands you want to view or change, such as numpad, trackpad, keyboard, etc., from the "Filter commands" popup menu, or use the search field to locate a particular command.
In this dialog, commands can be presented in either column view, which organizes commands into categories like general, information, and navigation, or table view, which displays a list of all VoiceOver commands, including user-configured ones, which you can navigate with the up and down arrow keys. To add a command in column view, locate the command, interact with the table of assignments, and specify your new one using the "Add input" popup menu. To add a command when in table view, click the Add button, interact with the table, and specify the input assignment from the popup menu labeled "None: edited." Then, press VO-Right-Arrow past an empty cell to another popup menu, and choose the command you want your new input assignment to perform.
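The description above mentions that custom commands can also run scripts. As a hedged illustration (the script and its behavior are my own example, not from the episode), here is the kind of minimal Python script one might assign to a custom VoiceOver command, building a spoken-friendly date phrase; on macOS its output could be piped to the `say` command to have it spoken aloud.

```python
#!/usr/bin/env python3
# Example sketch of a script that a custom VoiceOver command could
# launch (VoiceOver Utility > Commands can run scripts). It builds a
# spoken-friendly date phrase; on macOS you might pipe the printed
# output to the `say` command.
import datetime


def spoken_date(today=None):
    """Return a phrase like 'Today is Friday, September 27'."""
    today = today or datetime.date.today()
    # %d pads single-digit days with a zero, so strip that manually
    # for a more natural spoken phrase.
    return "Today is " + today.strftime("%A, %B %d").replace(" 0", " ")


if __name__ == "__main__":
    print(spoken_date())
```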
Transcript
Disclaimer: This transcript is generated by Aiko, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Tyler here.
With a demonstration and walkthrough of VoiceOver command customization on macOS.
Prior to macOS Sequoia, VoiceOver included several user configurable sets of commands, known as commanders, for the numpad, trackpad, keyboard, and quick nav.
With macOS Sequoia, these commanders have been consolidated into VoiceOver's broader command set, meaning in addition to the existing modifiers that you could use, like the option key for keyboard commander, you can also create your own command assignments using the VoiceOver modifier, which may be useful if you find a particular VoiceOver command difficult to perform, if you find a command in the commands menu, for example, that lacks a default assignment, or if you want to create a custom command to open an application.
So to demonstrate this, I'm going to open VoiceOver Utility on my Mac with VO-F8.
Opening VoiceOver utility.
VoiceOver utility.
VoiceOver utility.
Window.
Utility categories.
C for commanders.
Commands.
Commands.
VO-Right-Arrow.
VoiceOver modifier.
Control option or caps lock.
VoiceOver modifier.
VoiceOver modifier.
This is the setting that was located in the General category in prior versions of macOS, but the options are the same.
Control Option, Caps Lock, or Control Option or Caps Lock, which is the default.
VO-Right-Arrow.
Also control VoiceOver with.
Also control VoiceOver with.
Numpad.
Uncheck.
Checkbox.
Numpad, which was formerly known as Numpad Commander.
If you want to use, if you have a…
In this episode, Thomas Domville gives us an overview of new accessibility features for blind, DeafBlind, and visually impaired users in iOS 18. Topics covered include:
Disclaimer: This transcript is generated by Aiko, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
An AppleVis Original: What's New in iOS 18 for Accessibility?
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse. Like every year, a new iOS comes into play and Apple brings out new features for us to enjoy.
So, with no exception, this year we got a slew of new features for VoiceOver users, braille users, and other accessibility features that you might find of interest.
There's a lot to unpack here, so let's go ahead and jump right into it, and you can hear for yourself what is new in iOS 18 for accessibility.
Live recognition is now an option within your rotor if you include it.
To include it, you just go to your Accessibility settings, head over to Rotor, and include Live Recognition.
This allows you to access live recognition quickly and easily by simply going to your rotor.
Once enabled, we'll just head over to Live Recognition within our rotor, where you are able to select one or more of the various detections you would like enabled.
As you swipe down or up within the rotor, you get the various detections.
To enable a particular detection, simply do a one-finger double-tap; double-tap again and that'll disable it.
If you dismiss the rotor and keep the detection on, it will now live in your Dynamic Island toward the top of your device, or you can dismiss it from there as well.
Apple has now enhanced the VoiceOver Voices option within the rotor, which used to be called Language.
This rotor allows you to access the various voices that you have defined within Speech under Accessibility, but that itself has been completely revamped, which you can find within VoiceOver in the Accessibility settings.
Double tap on this now.
You now have two sections in here. First is your primary voice, which can be in any language, it doesn't necessarily have to be English, and they can…
Join David Nason, Thomas Domville, Michael Hansen, and Tyler Stephen in this AppleVis Extra as they dive into the highlights of Apple’s ‘Glowtime’ Event.
Transcript
Disclaimer: This transcript is generated by Aiko, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello there and welcome to AppleVis Extra.
This is episode number 100.
My name is Dave Nason and this is a very exciting day guys.
We're back.
I'm joined by Thomas Domville, Tyler Stephen and Michael Hansen from the AppleVis team.
How are you guys?
I'm good.
I'm doing wonderful.
How are you Mr. Dave?
I am great.
It's an exciting day.
It's our first day back online.
Oh yeah, it's been a really, really exciting day, a big day for us and a big day for the community. We love sharing in the outpouring of support that we've seen on the website. People are back, it's a great day, and it's also an Apple keynote day.
Yes, what timing we have, eh?
Well, it's ironic.
It's like we planned it on purpose.
Yes, so where again?
Every time, what happens is I say I can't believe another year has gone by, but here we go: iPhone event again, and some other stuff too.
It was an interesting event.
It opened actually with a video which had quite a lot of accessibility slash disability references in it.
I don't know if you guys noticed that, if you had the audio description on.
I did, and it was very nice that they incorporated that, wasn't it?
I mean, they always do like a million gazillion videos, but it just always seemed to be that first one.
That's always the most important one.
So putting accessibility in there was a nice touch.
Yeah, that was cool.
And then Tim came on stage or wherever he was outside somewhere.
He seemed to be shouting a lot at the start.
I don't know if anyone else noticed that, but yeah, he set the scene.
He mentioned Apple intelligence in pretty much his first sentence, I think.
So I think that kind of set the scene for the afternoon or for the morning.
Right.
It was almost like, you know how WWDC, we had this Christmas gift and we opened it.
It's like, oh, Apple intelligence.
We were so excited.
They just went ba ba ba ba ba ba ba all the way down.
I feel like today they just rewrapped the gift and we just reopened it because they just went through the whole same thing over again.
And you know, Google were accused of exactly the same thing at their Pixel event, that they just re-announced things.
I thought I was hearing some familiar things.
They were talking about iOS 18 and all of a sudden I was like, wait a minute, this is not anything new.
I mean, okay, so you want to re-announce things.
Okay, that's kind of how you know you're getting to the end of the announcement of whatever the product is.
They start talking about the software.
They're refreshing you on what to expect.
Yeah, I think that's the thing about AI marketing: when you're marketing that, and so much of it is software based, you've got to kind of beat the drums, the AI drums, as much as you can.
Apple doesn't use the term, specific term…
In this AppleVis Extra, Dave Nason and Thomas Domville engage in a discussion with Bryan Bashin and Hans Jørgen Wiberg from Be My Eyes about the acquisition of AppleVis. They tackle the tough questions, explore how Be My Eyes came to acquire AppleVis, and share insights into the experiences of the AppleVis Editorial Team before and after the acquisition.
Transcript
Disclaimer: This transcript is generated by Aiko, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello there and welcome to the AppleVis Extra podcast, the first AppleVis Extra podcast of the new era.
We wondered if we would have another one and we do.
We are so excited.
My name is Dave Nason and I'm joined, as so often, by Thomas Domville, also known as AnonyMouse.
How are you my friends?
Oh boy, that's a loaded question to come in.
How am I doing?
Well, you know, I'm ecstatic, I'm excited.
Gosh, what a whirlwind of emotion and whiplash that we have all gone through, right?
It's been a roller coaster, I think it's fair to say, over the last... it has only been a month, but it feels like about three years.
Right.
It does feel like it's been going on for years, that's how much impact this has had on us, hasn't it?
It really has.
So I suppose, for those who maybe aren't sure, what happened?
And you know, we don't need to go into the absolute weeds, but what happened on a high level, in terms of getting to a point where AppleVis appeared to be shutting down?
You know, yeah, let's start from the beginning.
So everybody will be on the same level as we are, because we have had a lot of questions and a lot of critiques, and I completely understand; we were in the same boat as you guys.
So what happened, guys?
Let's back up all the way to July.
This is when it all came down: July 3rd.
I remember that specifically; it was the day before America's Independence Day.
And we woke up with a bombshell of news from David. And, you know, David mentioned in there that he had thought about it for several months, and a lot of people have asked us: you had several months, how come you didn't do anything during that time? To be quite fair to the editorial team, no, we had zero idea; we had no idea.
It was just like we woke up and this bombshell was dropped.
And I think I can say it was so dramatic.
I will say that the first two days were just a blur, because I was still trying to digest and process what in God's name this means for all of us in the community and everything.
But of course, the most important thing out of this whole thing was that David had reached a limit with something he had put so much of his personal life into.
And so it was so understandable.
I completely understand where he was coming from.
I completely understand why he made that decision and why we're trying to process all this.
We were trying to figure out as a team what to do.
We had a meeting that weekend that we talked about the team and the steps that we wanted to take.
And during that meeting, David Goodwin was with us and he was very ill at that time.
We had no idea…
In her debut podcast, Tarja will showcase AudioCat for iOS. AudioCat is an accessible audiogames platform designed exclusively for iOS, focusing on immersive audio experiences. The app includes a virtual cat companion that users can name and care for, adding to the interactive fun. Currently, AudioCat features two audiogames: “Wordy,” a word-guessing game, and “Echoes of Valor,” an adventure RPG that delves into moral dilemmas. Players can earn coins and XP by playing these games, which can be used to feed their virtual pet and unlock additional content.
AudioCat on the App Store
https://apps.apple.com/us/app/audiocat/id6502510097
In this episode, Thomas Domville provides a detailed walkthrough of the ElevenLabs Reader for iOS. This versatile app transforms any text content into natural, expressive speech using advanced AI-generated voices. Perfect for articles, ePubs, PDFs, and more, it enables users to enjoy their favorite content on the go. With an extensive and ever-growing library of voices, the app offers a personalized listening experience tailored to any mood or occasion.
ElevenLabs Reader: AI Audio on the App Store
https://apps.apple.com/us/app/elevenlabs-reader-ai-audio/id6479373050
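For listeners curious about the technology behind the app: ElevenLabs also exposes its text-to-speech engine through a public REST API. The sketch below is a hedged illustration, not from the episode; the endpoint path, `xi-api-key` header, and model name follow the public v1 API as I understand it, and the voice ID is a placeholder, not a real one.

```python
import json
import os
import urllib.request

# Sketch of a call to the ElevenLabs v1 text-to-speech API (assumed
# endpoint and header names). The voice ID below is a placeholder;
# real IDs come from the /v1/voices endpoint, and ELEVENLABS_API_KEY
# must be set in the environment for an actual request to succeed.
API_KEY = os.environ.get("ELEVENLABS_API_KEY", "")
PLACEHOLDER_VOICE_ID = "your-voice-id-here"


def build_tts_request(text, voice_id=PLACEHOLDER_VOICE_ID):
    """Build (but do not send) a text-to-speech request."""
    url = f"https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"
    body = json.dumps({"text": text, "model_id": "eleven_multilingual_v2"}).encode()
    headers = {"xi-api-key": API_KEY, "Content-Type": "application/json"}
    return urllib.request.Request(url, data=body, headers=headers)


def synthesize(text, out_path="speech.mp3"):
    """Send the request and save the returned audio; needs a valid key."""
    with urllib.request.urlopen(build_tts_request(text)) as resp:
        with open(out_path, "wb") as f:
            f.write(resp.read())
```

Apps like ElevenLabs Reader wrap this kind of call in a polished, accessible interface, which is what the walkthrough in this episode demonstrates.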
Transcript:
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse.
I'm going to introduce you to an app called ElevenLabs Reader, and it's spelled just as it sounds.
So ElevenLabs is one word, E-L-E-V-E-N-L-A-B-S, then a space, then Reader.
This is an amazing app that you definitely want to check out if there's something that interests you, of course.
So I'm going to do a nice little review and walkthrough, and show you a demonstration of how to use ElevenLabs Reader.
Now, before I do that, ElevenLabs should sound familiar to most of you out there.
If you haven't heard of ElevenLabs, no problem.
ElevenLabs has been around for some time now, and what they're known for is being able to take any text and convert it into audio for you to listen to.
It's incredible technology.
It sounds marvelous, and it's trying to do its best to sound as natural as possible.
And I personally think they're getting really close to that moment where we will be able to say, wow, they definitely have a hit on their hands.
And I think this app is no doubt going to be something that's going to be popular by some of you out there.
Now I have used other apps.
I won't mention those names.
They will take various files, like text files or PDFs or EPUBs, and try to read them out using whatever VoiceOver voices we have now.
And as you know, well, for myself, I'm not a big fan of those things.
And it's really kind of hard to read books or listen to articles with those voices, especially when I come from a background where I'm not used to using older voices such as Eloquence and things like that.
It just sounds very unnatural, very robotic.
So I'm really more into more natural sounding speech.
So this is no doubt one of the big apps that I'm definitely going to keep on my main home screen from now on.
Now, that said, there are some quirks and issues with this app.
Yeah, for the most part, it is accessible and usable.
Now there are some things that, if you need to do them, can be difficult.
So I won't be pointing that out.
But I'm hoping that there will definitely be some updates to this app to make things more efficient for VoiceOver users.
Overall, I think this app definitely has potential and it definitely is going to be something I'm going to be using a lot.
So let's go ahead and…
In this episode, Thomas Domville discusses the workings of the Emergency SOS feature on iOS. This feature is designed to be a swift and straightforward method for summoning assistance and notifying emergency contacts. Upon activation, it dials the nearest emergency service and transmits your whereabouts. For iPhone 14 models and newer, it’s capable of establishing a connection with satellite services in the absence of cellular or Wi-Fi signals. Following an emergency call, your device will automatically send a message to your designated emergency contacts providing them with your location, and will continue to update them should you move.
Transcript
Disclaimer: This transcript is generated by Aiko, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse.
And this feature I'm going to be covering is something that I hope you will never have to use ever.
However, it is also one of those features that you should know about, know how to use it, and get prepared for it if you ever have to do that.
This feature is called the Emergency SOS.
Now, for those that have been using iOS for some time, you might be surprised how things have changed within the settings for Emergency SOS.
So this is probably a good time for you to go back and look into it and make sure all the settings are set the way you want it when you need it.
So let's dive into Emergency SOS; I'll show you what the feature looks like, what kind of setup and settings you can tweak, and, on top of that, how to use it.
There's quite a few things in here, so let's go ahead and dive into the Settings app here.
Settings, double tap to open.
So we're gonna do just that, one finger, double tap to open up Settings.
Settings.
And then from here, just swipe to the right until you get to Emergency SOS.
Emergency SOS, button.
And we have arrived here.
So let's go ahead and open this up with one finger, double tap.
Settings, back, Settings, back button.
And now the focus is going to be at the top left-hand corner where you have your back button that takes you back to the settings.
Let me swipe right here a few times, and we'll get to our first little item, where you are able to initiate Emergency SOS manually.
So there are two options to do this manually, and then I'll show you a couple of other options that can be done by automatic and when those would occur.
So as I swipe to the right.
Emergency SOS, heading.
Animated image depicting the buttons on the iPhone required to call emergency services.
Image, press and hold the side button and either volume button to make an emergency call.
In certain regions, you may need to specify an emergency service to dial.
Auto call requires a SIM card.
Call with hold and release.
Switch button on, double tap to toggle setting.
If you continuously hold the side button and either volume button, a countdown begins and an alarm sounds.
After the countdown, if you release the buttons, iPhone will call emergency services.
So here's our first option, and this should be set on.
If it's not, this is something you…