- Thank you for joining us today. My name is Darlene McLennan and I'm the manager of the Australian Disability Clearinghouse on Education and Training, ADCET for short. This webinar is being live captioned and Auslan interpreted. To activate the closed captioning, click on the CC button in the toolbar located at either the top or the bottom of your screen. We also have captions available via the browser; the URL for that is now being put into the chat pod. If you wish to change the size of the PowerPoint you are seeing and of the interpreter, you can click on the sidebar between the two images, between the video and the PowerPoint image, and you'll see some lines which will allow you to resize them. That's the first bit of the housekeeping. Now I'd like to acknowledge that I'm coming from Lutruwita, Tasmanian Aboriginal land, sea and waterways. I want to acknowledge with deep respect the traditional owners of the land, the Palawa people. I stand for a future that profoundly respects and acknowledges Aboriginal perspectives, culture, language and history, and the continued effort to fight for Aboriginal justice and rights that pave the way for a strong future. I would also like to acknowledge the traditional custodians of the various lands on which we work today and the Aboriginal and Torres Strait Islander people participating in this webinar. In July, ADCET undertook a survey of disability practitioners from the tertiary sector to understand the impact of COVID on their practice, on the students they support, and on teaching and learning. The report contains 18 recommendations, and these are going to guide our activities going forward. In the survey, a number of participants identified the challenges and struggles that many students who are deaf or hard of hearing were experiencing with the rapid move to online learning. We reached out to three of our fantastic, wonderful, brilliant colleagues, Cathy Easte, Gary Kerridge and Bobbie Blackson, and asked them to share their knowledge and experience with us. Before I ask Gary to introduce the session, just a few more housekeeping details. The session will be recorded and the recording will be available on ADCET in the coming days. If you have any technical difficulties you can email davids@stepsgroup.com.au. David has probably just put that into the chat now. Please contact him if you have any questions or concerns. The presentation will run for around 50 or so minutes. Our presenters have mentioned that they can be challenged by timekeeping, so we will try to keep them on time. We will have time for some questions at the end, so throughout the presentation, if you have a question you would like answered, please put it into the Q&A pod, not the chat pod. We're happy for people to chat throughout the session to each other and to us in the chat pod, and if you want to do that, please choose "all panelists and attendees". But for the questions at the end, put them into the Q&A pod. I hope that makes sense. It does confuse people sometimes. Over to you, Gary, and thank you all for joining us and thank you to the great presenters. - Hello everybody, I hope you can hear me okay. It's my pleasure to be here today. I'm particularly pleased to be working with Cathy and Bobbie; we go back a really long way, way back to 1985 when I met them both.
The three of us have seen many, many changes over that time, with the introduction of the National Relay Service, email, SMS, captioning, video relay interpreting. We three have been at the forefront of a lot of these changes. Now, COVID, when that happened, took us all by surprise. With all of the other changes, we had time to adapt and advocate and push, but when COVID hit us and education went online, we just had to respond really, really quickly. I am actually in awe of Cathy and Bobbie being at the coalface; they've had to make so many changes, meet so many challenges. We'll just put up the slide for challenges now. I'll read some of them out just in case we have people watching this who have vision impairment. Some of the challenges we faced were to provide equal access to the platforms, and there are many, many platforms. Today, we will focus mainly on four: Echo, Blackboard, Zoom and Teams, but we'll touch on many other issues, using YouTube videos and all of that sort of stuff. One of the biggest challenges was to ensure inclusion, because a lot of people think that providing the captioning or the interpreter is all that's needed, but as we found when we started to go on to these platforms, there are many, many different things. I won't talk about them because that will be the job of Cathy and Bobbie. When this all happened, there was a lot of anxiety. A lot of anxiety for students, a lot of anxiety for the teachers, a lot of anxiety for everybody. We had to deal with that anxiety. Of course, with any technology, we have limitations. The internet can drop out and so on. It also depended on the experience of students and staff with technology, because students and staff all have varying levels of knowledge. I, myself, am okay. Other people have no idea how to use Zoom, and I know myself I had contact from many students who were saying, "I'm stuck, I don't know how to use the captioning, I don't know where to book the captioning," and so on. Everybody is different. The language levels of deaf and hard of hearing people vary as well. Some of them like automatic captioning, some of them like real time captioning, some of them need Auslan interpreting; everybody is different. The institutions themselves have had to provide more captioning because we're teaching all sorts of information; it's very different from being there in the classroom. Some students who were actually in the class didn't need captions, but when they went online, they did. That's just a few of them. Can we go on to the next slide? The needs of students vary as well. They need access to live classes with captioning, interpreting or both; access to recordings of classes to review; access to all additional study materials, including videos and podcasts; access to notetaking and transcripts; and access to technology, with training in the use of that technology. They needed help with English grammar. They needed access to staff and tutorials and clarifications, and the needs change from class to class and teacher to teacher. So there were many, many challenges and many, many different needs. It's not my job to expand on these. I'm going to give that to Cathy and Bobbie, who have been at the coalface. I'd just like you to think about this. In this webinar at the moment, we have people who are not necessarily students, not necessarily teachers.
We have people who are just interested and have had to go online for team meetings or training or whatever. You'll find with this presentation there might be a lot of overlap with your own experience, so I hope you find it useful. I'll hand you over to Cathy. - Time for interpreter swap. - Hi, thanks Gary, appreciate that very much. Yes, we go back a long way. If I'm rude enough, I'll claim Gary as a Griffith person because we met in a … degree, which is part of Griffith, so yeah, we've got three Griffith people here. That says a lot. I met Bobbie way back then as well. Gary has shared with you some of those challenges and needs of students, and it became very clear, especially with COVID, that online is no longer anything like the past distance learning we have experienced. We were learning the capabilities of new technology to mirror what we used to do rather than to find new ways to do things. I think in the rush and the scramble, that's what we were doing. Now, months into the change, we are perhaps pausing to reflect upon what we do, why we do it and how we can do it better. It's a chance to think about some of the long-standing practices we have within our institutes that perhaps have less usefulness now, like large lecture theatres; I'm hearing they're on the way out. What can we do now that we perhaps couldn't do before? It's interesting that we've got deaf people around the nation online at the moment presenting, and interpreters in different locations, and we can still do this. It's something we perhaps couldn't have done even 12 months ago. We've moved a lot. Before I go too much further, I want to take a little bit of time and ask you to stop and think for a minute. If you can hear me, perhaps close your eyes. I want you to imagine, imagine being denied access to audio every time, everywhere, every day, in everything. All the time. No audio. Nothing. Deaf people who are requesting access to captions or interpreting are the equivalent of a hearing person requesting audio. It's no different. I want to stress from the start that there is no one-size-fits-all. Gary, Bobbie and I are all very different deaf people, but we're all deaf. Okay? If you compared our audiograms, you might not find a lot of difference between them, but our experiences as deaf people have made us who we are. Hopefully, today, we get to share a little bit about that and you'll have a better idea about how to help shape tomorrow's successes. We're sharing not only from our own practice, but from our own experience. Sometimes, you'll have to use trial and error on approaches, find what works, and review. On the screen, you have a list of technology options. It's, in part, a picture of what was on my whiteboard. I have a large whiteboard in my office and when COVID-19 hit, Bobbie was coming into my office every ten minutes, I think at one point, listing something new that a teacher was doing. This teacher's only going to use YouTube videos, this teacher is only going to use Vimeo videos. This teacher is only going to use Echo 360. This one's going to shove up all of last year's recordings. This one is only going to use ABC or SBS videos. This one's using ABC radio podcasts. This gives you an idea of the massive range of audiovisual material that is available for teaching today. There is a huge variety and selection of choices that people can make.
That makes it a little bit difficult because you can't just learn one system, like when we could put an interpreter in front of a class and you had access to an interpreter. That was simple. We know where to find interpreters; they come in, we pay for their parking. Once you go online, that becomes a lot more difficult. Maybe the interpreter doesn't have access to a good computer set up at home. It's not just the students we have to think about, but our service providers as well. That was certainly a challenge, and that is just a sample. The other thing I also want you to remember, I want you to think about at the moment, is that when a deaf or hard of hearing person is watching the screen all the time, whether they're watching captions or watching an interpreter, they can't look away to take notes. Yes, you can provide captions. Yes, you can provide an interpreter. But you're not providing access to the whole learning experience unless you're also thinking about that element of access as well. It's not just about access to what you're saying, but about enabling that person to behave as a student as well. That means being able to take notes. That means being able to be a participant in the class. Next slide. The next slide I'm talking about, and I want to get this out quite early, is the free auto captioning apps that are available. I'm not out to promote auto captioning, but I want to tell you that auto captioning is an option for some things. It's not an option to replace access. There are a few auto caption options available. MS Translator, which is Microsoft's translator app, is available on Apple and Android. I use it all the time because it gets me immediate access. If I get a meeting called at the last minute or I need to talk to an academic, I have access immediately. I might not have access to an interpreter or captioning immediately. Live Transcribe, Google's Live Transcribe, is another one, but it's only available on Android. That could be an option. Otter is also reported to be really good, but you only get 600 minutes free, then you have to pay for it. As a deaf person, I'm not going to pay for something like that when I can use something for free. But if your institution is going to pay, that's different. Yes, auto captions are terrible, and we'll give an example of how terrible they can be later. But they do have a place because they give you that immediacy of access. Some students will use auto captions, but it's important to remember they are not going to replace access. They do not replace access. Next slide, please. That's an example of auto captioning, and the slides are in a slightly different order to what I have. That shows you where the auto captions can go wrong. The first picture there is actually taken from MS Translator, from an actual meeting that I was in; I took screenshots of the meeting. The text is, "I won't be able to stay for the full meeting. Before we start I would like to know the Aboriginal interest rate Islander people as the traditional custodians of the lands on which rural mating and pay my respects to elders, past, present and future." That's not what was spoken. It's the very typical paying of your respects to the traditional custodians of the land on which we meet. The auto captions have got it wrong, and the issue is that when they do get it wrong, your mind has gone somewhere else and then your mind has to come back. So, you lose time; the meeting is still going ahead.
By the time you've read that and switched it to what it should be in your mind, the meeting is two minutes in front of you. So, you have a bit of catching up to do. The next one, quickly, you can have a look at these later, but the second picture says, "That's it, some surprise that person's name has sort of got their *** in a knot". What they're saying is they're surprised so and so had their knickers in a knot. But the captions will not type "knickers" because "knickers in a knot" is considered not an appropriate thing to say. It's on the level of a swear word, so a lot of auto captions will not caption swear words. So, it's not equal access, really. Next slide. Gary mentioned the platforms that we are going to talk about. The first platform we're going to talk about is Zoom. We're using Zoom now. Zoom has captions. You can make the captions larger. You can't change where the captions appear on the screen, which limits some options. Zoom doesn't have an auto caption feature, but you can put real time captioning within Zoom. I'm pretty sure, I think, it records the captions also. MS Teams is what we use quite a lot here at Griffith University. It has an auto caption feature. The auto caption feature, depending on the meeting, can be extremely good. It can be a little bit incorrect, but it can be very, very good. Auto captions will record, you can edit the transcript in MS Stream, and you can download the transcript. It does work quite well. Echo 360 is what a lot of people will be familiar with. They now have an ASR, automatic speech recognition, feature. The ASR feature in Echo 360 I find quite good, depending on the speaker. Griffith University has turned on ASR for all of its lectures where it can. Therefore, the ASR captions and the interactive transcript are available — the interactive transcript is really good because you can go forward and back, you can download the transcript, et cetera. If you load any other video into Echo, you can turn on the auto captioning. We turn on the auto captioning for anything we load into Echo 360 for any deaf or hard of hearing students. That doesn't mean we don't do closed captioning as well. We do closed captioning as well. The closed captioning can take four days to come back sometimes. The ASR gives you immediacy of access, and then you can access the closed captions later, should that be required as well. Blackboard Collaborate has quite a few issues. There's a spelling error there in Collaborate, apologies for that. Blackboard Collaborate doesn't have an auto caption feature. You can put real time closed captioning into Blackboard Collaborate, but the captions have to be entered within Collaborate. If you use a separate system in terms of the set up with Collaborate, it just kind of doesn't work. Just yesterday, we had a captioner refused access into a classroom because they immediately went into the chat and asked the deaf person to access 1CapApp rather than use Blackboard Collaborate, and our IT thought they were a scammer who had gatecrashed, and denied them access to the class. We've had to scramble to get the recording captioned later. Collaborate has a few limitations in that way, but you can enter real time captions and those captions will record. We generally put the recording then into Echo ASR to generate a transcript. From my understanding, from talking to our network, those are the majority of systems being used. Then there are the live class options and live video options.
With Echo, you can do live classes from home if you have the correct set up and good internet access. Some of our lecturers are doing live Echo classes from home. A lot of deaf people will like an external caption set up, external to the platform, just so they can have better control over the sizing, the colour of the captions, setting it to a black background with white text or a white background with black text, bolding, et cetera. So, they've got more options; they can move it to wherever they want on the screen and make the size of the box however they want. You've got to remember, when you're providing access with captions or interpreters in any of these platforms, it's not just about access to the content or the captions. You've got to provide access to the transcript as well, which is where, with a separate app like 1CapApp, for example, you may obtain a copy of the transcript. With Echo 360 Live, which we do quite a bit of here at Griffith as well, you get a copy of the transcript immediately at the end of the lecture. Every student gets a copy of that transcript. The transcript allows the student to take notes. The transcript also challenges the student to be what I call "engaged" in the learning process themselves. So, they're not just passive, sitting and watching a lecture and walking away. They have to do something with that transcript. You have to push your students; they have to be engaged with the learning. Having them edit the transcripts into notes that will work for them is a really good thing to be pushing. Sometimes, in terms of access when you've got captions, you've got to use two screens. Two screens can be really difficult and challenging. It's something you might have to actually sit and work with your students on, making sure they know how to change a 1CapApp screen so it's just two lines at the bottom of the screen rather than a whole page of text. We've got students who can't stand to look at a whole page of text, so they don't like the 1CapApp, for example. It varies from student to student. Next slide, please, I know I'm pushing my time here. So, the creation of notes. As I was saying, it's an important consideration. It's not just about having someone write notes and give them to the deaf student, but about challenging your deaf and hard of hearing students to be engaged in the learning process by making their own notes. Every hearing person makes their own notes and is engaged in that process through that. They can choose not to and not be engaged, and a deaf person can choose not to and not be engaged. A huge part of the learning is actually learning to take notes. When you're writing down what it is you understand, or making notes in some shape or form so you understand what is happening, repeating what you know, then you're learning more. Your success rate goes up. It's not just access to platforms we have to look at, but access to technology. So, access to WiFi, access to… do they have two screens, do they have a phone that they can watch the captions on while watching the class? How much video do they need? So, you need to think about their technology access. Also, the other students in the class: what do they need to know about the learning needs of your deaf and hard of hearing students, so that everybody is interacting equally?
Sometimes, if it's a very interactive class, a tutorial type class, you might have to teach the other students that they have to use chat a little bit more rather than all talking over the top of each other. It's a bit of training for the other students, so everybody's learning as well. The number of classes in a day and the energy levels of your student matter too. This is the fourth meeting I've had this morning already, so my energy levels are not as grand as they would be if this were my first meeting of the day. I've been watching captions in different platforms and in different shapes and sizes all morning, and that takes from somebody's energy levels. There's a limit sometimes to how much you can do that. These are considerations you may have to discuss with your students as well. Maybe they limit the number of classes that they're enrolled in so they can manage the load and their own fatigue levels. I can go home absolutely exhausted at the end of the day purely because I've had to strain to listen and watch captions all day. Timing of access is important. Do your deaf and hard of hearing students have access immediately? Do they have access to what they need to be a student? There was a Facebook post last week from a deaf person in Australia who was complaining that he had to wait 14 days for access to the transcript from a class, and that's regular. Every single class, every single week, he's waiting 14 days for access to the transcript. He was told that was considered the norm. That cannot be good enough. That cannot be good enough. Four days is the maximum anybody should have to wait for captions. They should not have to wait for… I don't think they should have to wait four days, but timing of access is important. You want your students to compete on the same level and equally with other students. They need access to that material. That's why I say sometimes, as a matter of urgency, auto captions can be an option for some deaf and hard of hearing students. Listening fatigue is also very important. Very tiring. The other thing I want to quickly say, I know I'm out of time: in all learning of new material, one thing that makes all learning really, really difficult for some deaf or hard of hearing people is how to pronounce new terminology. When I lost my hearing, I woke up one morning and I had zero hearing. I lost all my hearing. I was doing my Year 12 study at that point in time and I would go to classes and I'd go… previously, I was at the top of my class and now I couldn't even read my textbook. I couldn't read my textbook because of the words in the textbook in biology: this thing makes this happen to get this thing… I couldn't say these words, and if I couldn't say them, I couldn't read the book. Before, I'd sit in class, hear the teacher say a word, and I'd look: oh, that's what she's saying. You would associate it. When I couldn't do that, it really affected my learning. It affected my ability to read. It affected my ability… I felt like I'd suddenly become stupid. I'd only lost my hearing, yet I couldn't read and learn any more. How you learn as an individual is very important. This is not true for all deaf and hard of hearing people, but it's true for some. If they can't say the words, they find it very difficult to learn. Pronunciation, being able to say those words, is very important.
I was fortunate that I talked to my biology teacher and I said, "I've just become dumb as well, I might as well quit." She said, "No," and we opened the textbook. I said, "I don't understand this." She forced me to read it aloud, and that's when I realised, oh, and we made the association that it was because I couldn't say those words. So, she would write out all the pronunciations for me so that I could read those words. Sometimes, it's the little things, and you've got to talk to your students about what it is that affects their learning. I'm going to hand over to Bobbie, and Bobbie's going to talk about Auslan interpreting on online platforms. - Interpreter change over. - Hello. I'm a deaf Auslan user and I'm presenting today. I have been here at Griffith University for 13 years. I've been working as an interpreter coordinator, coordinating services and supporting deaf students, supporting interpreters, and interviewing interpreters and students to find the right match. Sometimes, if there are issues for students, I will listen to their issues and help them solve their problems. I also help students to develop the appropriate skills, especially students who have arrived from secondary college into the university setting: helping them with their study skills and negotiation skills, if they're having difficulties with the academic language or with the interpreters that are required, and helping them to adjust to the real world of university learning, which will also help them to adapt to using interpreters in the real world of employment. So, COVID-19, as we all know, has had a significant impact and we've had to make some very big transitions and changes; it's been a big, steep learning curve for us as teachers as well. Now, the approach to online learning platforms. Auslan itself is a beautiful language. It's a three-dimensional language. Because of this, the three-dimensional nature of Auslan can really be impacted when we're looking at a two-dimensional screen; we need to adapt the way we're communicating and signing so it's a lot more visible to the audience on an online platform screen. We need to make adjustments when we're signing to make sure our finger spelling is clear, our signing is clear, and the way that we're communicating through Auslan is clearer. When interpreters interpret, there might be a simultaneous interpretation process. For example, if a hearing person is singing the Australian anthem, that is a known, prepared text, and the interpreter would be able to interpret that song simultaneously as it's being sung. But more commonly, there will be a consecutive interpreting process, which means that the speaker will speak the message and the interpreter will go through a time of processing cognitively in order to translate that message into Auslan. You might have seen a lot of interpreters on television now because of the emergency services, the bushfire recovery, emergency notifications, or through the COVID-19 event. You might notice that the interpreter doesn't start signing straightaway. They wait until the message is spoken, then they process the message and then, a few moments after the speaker has spoken, start signing. So, there is a little bit of a lag time and the interpreter is a little bit behind in producing those Auslan signs, those interpreted messages.
Also, when interpreters are working on television, there is less finger spelling and bigger, more spacious signing, like in a theatre. They use more space in the signing to convey the message. Next slide, please. In terms of finger spelling: sometimes, when a message is finger spelt and it's misspelt, it can impact the student's ability to understand the context and comprehend the message. In a real life situation, being able to ask for clarification is a bit easier than online, because that interaction of asking for clarification is between the interpreter and the speaker, and online sometimes the speaker is hidden. Online, it's a little bit more difficult for an interpreter to see everybody who is present at that event. So, that's a bit of a challenge. A confidential chat, a private chat, sorry, can be used between the two interpreters, or between the interpreter and the student, in order to raise any concerns about the interpreting process. Usually, the interpreters don't share private information like a phone number with the student. They will use the chat. At Griffith University, we have staff interpreters, which means that the interpreters have Griffith University email addresses, so they're able to communicate through their Griffith University email address with the students. This has been very valuable for the students and the interpreters, so they can access information like class resources and do the reading prior to the class. That is also the way in which they are able to use their Griffith University email address, as a staff interpreter, to access Zoom and have private discussions with the student, or discussions in general with the student. That's been very valuable. It's really important for the interpreters and the students to be able to see each other clearly. For the communication exchange to be effective, the interpreter and the student need to be able to see each other. It's also vital that the interpreter has the preparation materials prior to the class and is able to comprehend the content beforehand. Next slide, please. When we watch Auslan, subtitles and PowerPoints all at the same time, it's a huge amount of information to take in. I'm assuming you have English as your first language; if that is the case, put that aside and imagine that you are reading German subtitles while listening to French dialogue. You may be able to acquire some of that French dialogue as you're watching that play on the screen, and some of the German subtitling, but you are not able to comprehend the entire message in French and German at the same time. Your brain cannot do that. Your brain needs to make a choice between the French dialogue and the German subtitles. So, you may get some information from each. Sometimes, deaf people will watch the interpreter, acquire the message, then read the subtitles at a later time. They find that that's the best way for them to learn the content, rather than looking between the interpreter interpreting the message of the class and reading the subtitles or the text. It's a lot of unpacking that needs to happen all at once.
The transcript becomes valuable information; it is important for the student to actually receive the information prior, so that when they are watching the interpreter, they are already familiar with the content. At Griffith University, we use the Teams platform for video meetings, for teaching, et cetera. But for the classroom, everything is provided online: the lecture is provided online and the teaching and learning at Griffith is delivered via the Blackboard platform. We have set up another, separate system in Teams, and this is for video interpreting, so that the student and interpreter can see each other. It's not possible to have both platforms operating on the same device. So, both the interpreter and the student are able to see the Blackboard platform, look at the PowerPoints and all of the other educational resources like the lectures, and ask questions, but Teams is set up separately in order to create that interpreting process for the student. I can show you a video or a photograph of what that actually looks like a little bit later. Next slide, please. I really think we should have extended the webinar until 2:30. So, the screen has frozen. When the screen freezes during a class, imagine that the student is already actually behind. The academic may have been trained for years in this department and is quite good at summarising a body of information and a body of knowledge within an 18-week course. The interpreter has to work hard to make sure that all of this information is interpreted for the student. So, the screen has to be clear. You have to have a very good internet connection. There may be other micro distractions that may be part of the reason why it's difficult for the student to focus. And the last one is tips. Next slide, please. Auslan interpreting tips for online platforms. Run a test with the student, interpreter and academic beforehand to make sure that you can see each other and that the lighting and positioning on the screen are adequate. Sometimes, you have to negotiate with some of the students to sit back a little bit so that they are not completely filling the screen with their face. Sort out how the camera might be operating. Mostly for the interpreters, the transcript and materials will need to be received weekly, and it's important for the interpreter to read the preparation for the course. Especially if there's a replacement interpreter, it's important that the coordinator make sure the interpreter replacing the original interpreter has been given the course information for that week. Set up a text backup, a chat function in Teams or a Zoom meeting. Book the same interpreters for the duration of that 18-week semester to create some normality, so the student can become used to the interpreter's style and there is a consistent approach to the interpreting for the entire course. That is preferable over having interpreters that have just been brought in for one or two weeks and then different interpreters for other weeks. The body of knowledge that builds throughout the semester is very important, so it matters that two consistent interpreters are provided to the student. That's a summary from me. I'd like to now give the microphone back to Cathy. - Thank you very much Bobbie, that was, yeah, very comprehensive.
We are pushed for time and, even though we haven't let Gary talk very much, which is not very nice of us, Bobbie, Gary and I could talk about this for hours, so I will try to go through the rest of our information. I mentioned previously micro distractions. That's when something is incorrect, whether it's an auto caption or you don't have access to something; you're distracted and you have to come back, you've lost that content, you've lost that information. That's a mental break that you then have to bridge to get back into that content. That is very, very tiring. It makes you stop and think, "What did I just hear? What did I just learn?" It is tiring. Knowing who's speaking online is a huge issue. I went through that in a two and a half hour meeting this morning and I made sure that… you have to say who's speaking. It takes two seconds, and 20-odd managers and directors can't remember a simple instruction, that they have to say their name before they speak, so all the captions run into each other and I don't have any idea who is speaking. This is the same in a classroom situation. When there are multiple speakers and people don't say their name, the student reading these captions doesn't know: is that content from the lecturer, which probably has greater importance and is more correct, or is it a comment from a student? Which is which? If people don't say their name when speaking, you don't have time, from the captions, to catch the visual of who's saying what, so it becomes really, really difficult. Little things like saying your name help to slow things down, because there's something else that has to be typed. You might catch up to the content a little bit more. So, it gives you a little bit more time, and taking that pause to be able to participate maybe helps some deaf and hard of hearing people to actually be involved in the discussion. We talked about connectivity issues. Bobbie talked about the pixelation, all of those sorts of things. If you don't have good connectivity, you've lost the captions, you've lost the interpreter. Language issues were mentioned by Gary as well, English language issues. There's also new terminology that a deaf and hard of hearing person has to learn to be able to participate, to know how to say those words to give an oral presentation. They're not going to use the terminology that everybody else uses, and that's my biggest issue: I don't use a lot of new terminology, I stick to a lot of older terminology because I know how to say those things. People say I have very good language and speak well for a deaf person; I tell you, it takes a hell of a lot of effort to maintain it. It takes a hell of a lot of effort. People get very anxious about this, and your deaf and hard of hearing students are very anxious about their oral presentations and so forth, because monitoring how loud you speak or don't speak takes a great deal of energy and it's not always possible. So, it is very, very difficult. Next slide, please. I do know we've got ten minutes left and I do have to give Gary a little bit of time to say something. Tips for students. We've talked about that. Use a stand for phones or tablets. Don't have them lying down. There are some simple stands you can get.
I use a stand that I can fold up and take with me wherever I go, so I can put my phone in front of my computer screen if I need to watch the whole screen for video and PowerPoint information, because that's really, really important, and I have my captions on my phone. I can hook my phone over the top of my computer screen. I can do the whole works. There's a picture on the next screen when we get to that. Use a separate link for captioning where possible, so long as it's not going to impact the learning of somebody else. We have classes here where we have deaf and hard of hearing people, and we have students with other learning needs who need access to the captions, et cetera, as well. Sometimes, we cannot give individual students a choice. They have to have the captions within Collaborate and they can't have them separately, or they have to have the captions in a certain way because there are a number of students within that class. That's the reality, but they still have access. They'll still have access to transcripts and all of those things as well. Learn how to pin interpreters in different applications where you can. Use a different link, as Bobbie mentioned, for the interpreters. You might have the class in Collaborate and use Teams for the interpreters. Whatever you can get to work that makes it easier. Minimise the distractions. Check the audio. Check your connectivity. Students need to learn how to use the technology that they're using. They also need to know who to contact. For the webinar today, we set up a WhatsApp group. If something goes wrong, we can WhatsApp and say, "My video's totally collapsed, I can't do this," or, "The captions are totally lost, it's not working, my captioning is frozen, I can't follow what anybody is saying. Who do I contact?" It's the same for students. The lecturer is going to keep talking, and I can keep talking; what's the back-up plan? Students need to know what that back-up plan is. Next slide, please. This is a picture of my stand. There's my stand over there. It cost me $16. I can fold it and stick it over the top of my computer screen. I can sit it on the edge of the laptop. Visually, everything's in the same frame for me. This is what you need to think about with your students as well. Keep it simple. Next slide. I am rushing and I do apologise. These are some tips for teachers and lecturers. Be aware of needs beforehand. Record classes where you can so students can review. Limit the number of participants that can be seen on the screen at any one time. Establish protocols for chat, questions, discussions. Make those protocols work for the whole class, not separate protocols for deaf and hard of hearing students. Make those protocols visual. Everyone's got to use a thumbs up when they agree or understand, if you've got a thumbs up option in something. Use what works for everybody. Your protocols can work for everybody, not just be something separate for the deaf and hard of hearing students. Build in pauses where you can. Let everybody catch their breath. That will allow deaf people to participate. It's important to stay positive, because I do believe there'll be some lasting positive changes that come out of everything that we've had to rush to put in place for the COVID impacts, et cetera. This will shape the practice of educational disability support provision into the future. I'm going to hand over to Gary to summarise. Any other lasting comments or questions? - Thank you.
As we said, we're not going to be having questions because we've run out of time, but your questions will be answered and we'll put them on the website as soon as we possibly can. I thought we've got two minutes to try and summarise. As Bobbie and I said in a signed conversation, each of us could go on about this for an hour on our own; there are so many things we could talk about. With COVID and going online so quickly, you can see there were many challenges. Sometimes our own interpreters changed. Sometimes the light changed. Sometimes the captioning was behind. All those sorts of things. This presentation itself was a prime example of some of the challenges that deaf and hard of hearing people face every day when they do these sorts of things. The positive thing is that, because of technology, we now have access to this sort of thing. Five years ago, ten years ago, we wouldn't have had a way of getting interpreters or captioners here. Sharing this sort of information about the pros and cons, the tips, the technology, helps all of us improve. I think there needs to be an ongoing conversation. I thank all of you for attending today. I enjoyed it. I actually learned a lot listening to Bobbie and Cathy. I'm absolutely in awe of how they have responded. They and other disability practitioners, and the support they've given to deaf and hard of hearing people, people with a disability and teaching staff; I think we should be very thankful they are there. Thank you. Have a good day and weekend, and thank you for attending. - Thank you, Gary, and thank you, everybody. Just to confirm, we will answer the questions; they will be posted under the video, which will be on ADCET, and everybody who registered today will get a link to that video as well. Thank you, everybody, thank you to the interpreters and the captioner as well.