Images of participants in the workshop signing Egyptian Sign Language while watching the instructor, overlaid on a presentation background of the Arabic alphabet in sign language

AUC's New Sign Language Workshop

Celeste Abourjeili January 20, 2025
Civic Engagement

Did you know that sign language, like spoken language, is not universal? Egyptian Sign Language, the most commonly used type of sign language in Egypt, differs from other international sign languages. 

The universality of sign language is one of the misconceptions that a new workshop offered by the School of Continuing Education, Basics of Egyptian Sign Language, aims to address. The workshop equips participants with basic sign language skills, enabling them to communicate effectively with people who are deaf or hard of hearing, promoting inclusivity and accessibility.

Maged Habashy, director of the Languages Department at SCE, said, “This workshop can benefit anyone — AUC faculty, educators, healthcare professionals, customer service agents and more.” The workshop is currently training 180 customer service agents in sign language in collaboration with a French company running the Cairo Light Rail Transit between Cairo and the New Administrative Capital.

Offered at AUC Tahrir Square, the workshop kicks off by training participants to sign the Arabic alphabet and greetings, as well as vocabulary relating to family, professions, religion, governorates and countries. Participants also learn to sign adjectives that express different emotions, allowing them to maintain conversations with people who are deaf using complete sentences.

“The sign language workshop at AUC was a remarkable experience for me. It added a lot to my understanding of deaf culture and the needs of the deaf in communications,” said participant Wafaa Shoukry.

In addition to linguistic skills, the workshop highlights the culture of people who are deaf and addresses several misconceptions, including the notion that they are intellectually challenged or unable to participate in society. The first part of the workshop is dedicated to correcting such misunderstandings and exploring the need for sign language skills in the Egyptian community.

As interest grows, SCE plans to offer an advanced Egyptian Sign Language course in the future. Shoukry, who is excited to build on her skills in future sessions, said, “I highly recommend this workshop to anyone interested in learning sign language.”

The workshop is available on demand on the SCE website.

Related stories

A visually impaired person being guided by someone
October 20, 2025

Tips on Navigating Inclusivity

Debunking myths, Farid shared firsthand tips on the do’s and don’ts of engaging with people with visual impairments:

Face to Face

Don’t assume we need help. Ask first.

If we do need help, it will only relate to visual impairment. We don’t need help with everything.

Communicate with us on how we would like to be assisted so there is mutual respect for each other’s space. Don’t ask someone to lead us out of a place. Speak to us directly on how we would like to be guided.

Offering help is thoughtful and appreciated, but don’t go out of your way to accompany us wherever we go. Sometimes giving directions is all we need.

Don’t treat us differently than people around us. If we’re standing with a group, there is no need to grab a chair for us or get us coffee. Don’t give us front-row seating. These are nice gestures from well-intentioned people, but they make us feel uncomfortable.

You don’t need to raise your voice to speak with us. We only lost our eyesight; our other senses are not affected.

If two people, including a person with a visual impairment, are talking to you or asking a question, address them both. Don’t just speak to the sighted person. We can sense it.

The tools we use are not toys for others to fool around with. My cane is indispensable for my safety and existence.

Avoid grabbing or pulling someone’s cane, even if you’re trying to help that person move around. It’s dangerous and invasive. You can ask to hold one end of the cane while we hold the other.

Don’t scream “Be careful” or honk at us when crossing the street. It distracts rather than alerts us.
Student Experience
Zaydan family receiving degrees at graduation
September 22, 2025

'AUC Saw Beyond Labels'

“I want to take this moment to express my deepest gratitude for welcoming Omar and me and for creating an environment where Omar can thrive not despite being autistic, but through it,” said Hala Ibrahim Zaydan, parent of Omar Sharaf El Din ’25, as she accepted an honorary Doctorate of Humane Letters at the June 2025 commencement ceremony, during which Sharaf El Din graduated.

Though the family originally worried about Sharaf El Din’s adjustment to university life, with the help of AUC’s accommodations, Sharaf El Din graduated with a Bachelor of Science in mathematics. His mother was awarded an honorary doctorate for her dedication to her son’s studies and commitment to promoting inclusion at AUC.

“[Zaydan] worked closely with AUC faculty, administrators and the Center for Student Wellbeing to strengthen support systems for students with disabilities,” reads her honorary degree citation. “Her efforts have not only guided her son’s academic success but also extended to assisting other students on campus, reflecting her commitment to empathy, collaboration and community. Her contributions exemplify the values of accessibility, integrity and inclusion that AUC upholds.”

With the help of AUC’s accessibility initiatives, Zaydan was able to accompany Sharaf El Din to classes as his aide. As his mother, Zaydan knows her son’s specific needs and has been shadowing him since Grade 5, ensuring he received a quality education. She left her profession as a dermatologist to help Sharaf El Din in school and advocate for disability support.

With inclusion as one of its core tenets, AUC ensured both Sharaf El Din and Zaydan were thoroughly supported. The Center for Student Wellbeing worked with the family to provide standard accommodations for his course load, note-taking and exams. “AUC believed in him and saw beyond labels,” said Zaydan. “The University was incredibly flexible and brought out the best in Omar without unnecessary pressure. The mathematics faculty were also very supportive and made every effort to respond effectively to Omar’s needs. They are providing guidance and support to help him find a suitable job.”

For Zaydan, learning math alongside her son was a departure from the material she was used to studying, but she was dedicated to providing Sharaf El Din with total support. She appreciated the English classes required at AUC as well as the liberal arts curriculum. Zaydan’s favorite class was ancient Egyptian studies, about which she joked, “Maybe I’ll do a master’s next!”
University News
Science Of Signs
March 31, 2020

Science Of Signs

By Yakin Ouederni

Communicating in Arabic Sign Language with the tap of a phone

Baher Moursy '18, Heba Sakr '18, Youssef Khairalla '18 and Tamara Nagui '17 have always been driven by a desire to give back to their communities whenever the opportunity arises. So when it came time for them to decide on a graduation project in 2017, it was difficult to settle on one of the many ideas they thought of, but they were sure of one thing: Whatever it was, it needed to go beyond being just a thesis project.

After months of brainstorming, they came up with Eshara, a mobile application that translates Arabic Sign Language (ASL) into written Arabic in real time. Eshara works on any device with a camera, using video recognition technology to detect the hand movements of someone speaking ASL and simultaneously provide a written translation.

"Imagine a world in which a student can raise his or her hand in a classroom and sign what he or she is trying to communicate, and everyone else in the classroom reading or hearing the translation instantly," Khairalla said.

The Eshara logo was designed by Yasmine Nagui '16

Due to rapid advancements in artificial intelligence (AI), the world that Khairalla speaks of can soon become a reality. Eshara relies on computer vision and machine learning, a subset of AI that uses systems to identify patterns and make decisions with little to no human intervention. Both fields are increasingly being used across a variety of areas such as health care monitoring, financial services and transportation. In Eshara's case, the group films people speaking ASL and then programs the software to recognize specific movements as words so that it may provide a written translation.

"Basically, we design and train several AI computing models to associate each ASL word gesture with its corresponding text," said Mohamed Moustafa, associate professor in the Department of Computer Science and Engineering and the team's supervisor. "This work of action recognition is actually part of a bigger research topic known as human-computer interaction. Its applications are all futuristic."

Noting the great impact of machine learning on people's lives, Moustafa emphasized the importance of investing time into developing this field. "Self-driving vehicles are expected to be fully autonomous in a handful of years, bidirectional language translation is helping real-time conversations, and mature models are being developed to recommend your next online purchase," he said.

And it's exactly this type of impact that the team hopes to make with Eshara: creating programs that make people's lives easier by using AI to help underserved populations and inspire others to do the same. "We have the knowledge; we have the technology," Moursy said. "Why don't we use it for the good of society?"

"Be the Future" in Arabic Sign Language by (front row) Youssef Khairalla, Heba Sakr, Baher Moursy and (back) Amr AbdelGhani

Two years after presenting their idea to the thesis panel, the team members are still developing the app. Amr AbdelGhani '19 has since joined the team.

What the team members have now is the demo they created for the graduation project, which is capable of translating individual words and not full sentences. By the end of the year, they hope to accomplish two goals for the app: translating full sentences and being quick enough to produce words in real time.
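Moustafa describes the core task as training models to associate each filmed ASL gesture with its corresponding text, which is essentially sequence classification. The sketch below illustrates that idea only; it is not the team's code. It assumes hand keypoints have already been extracted from each video frame (for example, with an off-the-shelf pose-estimation library), and the tiny vocabulary is invented for the example.

```python
# Illustrative sketch: map a sequence of hand keypoints to one word from a
# small vocabulary. Keypoint extraction from video frames is assumed to have
# happened already; the vocabulary and all names here are hypothetical.
import torch
import torch.nn as nn

VOCAB = ["hello", "thanks", "family", "work"]  # invented demo words

class GestureToWord(nn.Module):
    def __init__(self, n_keypoints=21, hidden=64, n_words=len(VOCAB)):
        super().__init__()
        # Each frame: 21 hand keypoints, (x, y) each, flattened to 42 features.
        self.encoder = nn.GRU(input_size=n_keypoints * 2,
                              hidden_size=hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, n_words)

    def forward(self, frames):                 # frames: (batch, time, 42)
        _, last_hidden = self.encoder(frames)  # summary of the whole clip
        return self.classifier(last_hidden[-1])

model = GestureToWord()
clip = torch.randn(1, 30, 42)                  # one 30-frame clip of keypoints
word = VOCAB[model(clip).argmax(dim=1).item()]
print(word)                                    # untrained, so an arbitrary word
```

A model along these lines would be trained on the labeled clips the team films; the real-time goal then becomes a constraint that keypoint extraction and classification together keep pace with the phone's camera.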
Development takes place in two stages: data collection, then programming the machine.

Eshara app design

When the team members initially started conducting research, they were surprised to see that there was no project like theirs targeting ASL speakers in the Middle East. This made the data collection process much more difficult and time-consuming. "It took us around a year to collect data for the demo," Moursy said.

One year for 16 words.

Data collection involves choosing words to include in the Eshara dictionary and then filming those words being spoken in ASL. To ensure Eshara's accuracy at all times of the day and in different locations, the team needed to film movements across a wide array of environments and with different people. With no ASL experience and no contacts with anyone who speaks it, the team members had to learn the movements and train their friends to shoot the videos.

"When filming the words, we needed a wide range of skin colors, hand sizes, backgrounds and lighting, and we had to film at different times of the day," Sakr said.

Their project stalled for one year when they lost funding, but AUC secured funding earlier in 2019, and the team got right back to work in July. Since then, they have expanded the dictionary to between 800 and 1,000 words.

"Being part of the Eshara team resembles a great opportunity for me to help create something that can directly impact people's lives and has the potential to revolutionize communication with those who cannot hear," said AbdelGhani. "The current progress is unprecedented for Arabic Sign Language recognition, and I believe that we could potentially push this technology to achieve a breakthrough in scalable automatic ASL recognition. The passion and excitement of the team along with AUC's support make this project a fun and fulfilling journey."

Data Collection for Eshara

Turning AUC Tahrir Square into their workspace, the team members meet with ASL professionals and with communities in Cairo of people who cannot hear or speak to film vocabulary. Khairalla, who is currently overseas, is helping with researching quicker and more efficient technologies for the app.

"Speed is our main concern," Moursy said. "The app needs to be able to translate full sentences in real time."

Sign language translating programs do exist for languages other than ASL, but what differentiates Eshara from all others is its accessibility. Most programs use sensor gloves or 3D cameras to detect hand movements, while Eshara can be downloaded on mobile phones and works across different software.

"If someone who speaks Arabic Sign Language is sitting right next to me, there would be no way of communicating with him or her," Moursy said. "Eshara would allow me to just pull out my phone and carry out a conversation. We're not realizing that there's a whole demographic of people we aren't talking to."

What started out as a graduation project turned into an initiative to include an often neglected demographic in society. Even AUC Tahrir Square, which started off as a meeting place for data collection, became a space for a community of people working toward a common cause.

"The ASL professionals were really excited about the project and even want to continue helping us for free," Sakr said, emphasizing how this project will allow ASL speakers to enter the workforce more easily, making way for diverse skills and talents that were previously not tapped into.
"We're helping to create a better future for them because they can finally be able to take part in regular, everyday life like everyone else."
Civic Engagement
Mind Over Matter
February 26, 2025

Mind Over Matter

Restoring vision to the blind and motor function to people with disabilities -- what's the science behind it all? Seif Eldawlatly, associate professor of computer science and engineering, is unlocking new possibilities for people with disabilities and others through his research in the futuristic field of brain-computer interfaces (BCIs).

Now, after 18 years of work in the field, Eldawlatly is helping AUC develop its expertise in BCIs and exploring opportunities to improve everyone's lives. "This field is relatively new, and not many people in Egypt or the region actually work in this research area," he says.

Associate Professor Seif Eldawlatly is introducing the cutting-edge field of brain-computer interfaces to AUC

Technological Trajectory

Eldawlatly's work has evolved alongside the technology. Visual prosthetics work by recreating visual signals in the brain via implanted electrodes. "Our eyes are almost like a camera taking a picture of what we see, but the actual vision and perception take place in the brain. We understand what we see through the process in our brains," he says. So when the eye malfunctions or stops working due to disease, Eldawlatly says it is possible to control the brain itself by artificially providing the input that was supposed to come from the eye.

A chip attached to wires acting as 30-some electrodes could be implanted in the brain, receiving inputs from an external camera attached to glasses. "That's what the field of visual prosthetics is: We try to use artificial intelligence (AI) to deliver electrical pulses to certain locations in the brain related to vision. If we send the right signals to these locations, people could see again, at least partially," he says.

In his research addressing motor disabilities, Eldawlatly has mostly used electroencephalogram (EEG) recording headsets, a non-invasive technology that records brain activity through electrical signals picked up by small sensors attached to the scalp.

A subject in Eldawlatly's study wears an EEG headset, photo courtesy of Seif Eldawlatly

In one of Eldawlatly's experiments, people with disabilities were able to write words by selecting letters displayed in flickering boxes of varying paces, each of which would elicit a different electrical pulse in the brain. Once the EEG headset picked up the electrical signals, AI running on the computer would identify the correct letter, allowing the person to write using only their brain. In an adjacent project, the EEG detected a spike when the desired character was displayed, providing an alternative system to the flickering boxes. Patients were able to move their wheelchairs using the same technology.

BCI can also be used to enhance daily life for those without disabilities or diseases. Eldawlatly shared the example of an AUC senior project he supervised that was conducted in collaboration with Siemens, enabling emergency braking in vehicles based on brain signals.

"When an emergency-braking situation happens, such as a car in front of us suddenly stopping, many people panic, for say 100 milliseconds, but their brain detects that they need to stop the car even if they don't press the brakes. They hesitate for a moment, and because of that, an accident might happen," he explains.
By detecting the brain pattern corresponding to emergency braking, the car can be stopped, avoiding the accident.

Additionally, Eldawlatly has been developing AI techniques to diagnose the neurodegenerative disease amyotrophic lateral sclerosis (ALS), a fatal illness that is usually diagnosed in its later stages. With machine learning algorithms, Eldawlatly's team is working on identifying abnormal patterns in signals from the spinal cord. "If we can diagnose the disease early on, we can start administering drugs to slow its progress, elongating the life of the patient and preventing them from losing all function," he says.

Field of the Future

Much of Eldawlatly's work may sound aspirational and futuristic, far off from the world we live in now. But BCI is already being used around the world, and there are companies already performing chip implantations in the brains of paralyzed patients seeking mobility, or a simulation of mobility. And while Eldawlatly has not worked on invasive procedures outside of animal testing, he believes that the future lies in both invasive BCI, implemented through the surgical insertion of electrodes in the brain, and noninvasive BCI, through external apparatuses such as EEGs.

This is why he emphasizes the need for strict ethical guidelines around BCI practices. Regarding invasive techniques, Eldawlatly says, "All the work being done in this field of research has to follow strict ethical guidelines. Otherwise, if it falls into the wrong hands, the technology might cause issues. However, the good news is that patients have to undergo surgery first, so they have to agree to it."

He added that, whether surgically invasive or not, brain monitoring raises concerns of privacy, so ethics are always a priority. "In both cases, we're getting information about what the brain is trying to do, and the brain, not our face or fingerprint, is the true representation of our identity. So the data should not be used for anything that the subject does not approve," Eldawlatly says.

Though he works in data analysis and not in hardware development, Eldawlatly says that the technology can soon become accessible, with EEG headsets already available at varying price ranges. "Once the industry turns the research into a product, it becomes a reality. It becomes something that everyone is using," he says, comparing BCI to AI, which was not widely known until ChatGPT became publicly accessible -- even though researchers had been developing AI technology for decades. "Now everyone is using AI, so the same thing might happen in brain-computer interfaces," he says.
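The speller Eldawlatly describes earlier (flickering boxes, each eliciting a distinguishable brain response that software picks out of the EEG) comes down to classifying short windows, or epochs, of multichannel EEG. The sketch below is illustrative only and uses random arrays in place of recordings; it is not the lab's pipeline, and a real system would filter the signal and align epochs to stimulus onsets first.

```python
# Illustrative sketch: classify EEG epochs as "attended target" vs. not.
# Random arrays stand in for real recordings; everything here is hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_epochs, n_channels, n_samples = 200, 8, 128   # e.g. 8-channel headset, 0.5 s at 256 Hz
X = rng.standard_normal((n_epochs, n_channels * n_samples))  # flattened epochs
y = rng.integers(0, 2, size=n_epochs)                        # 1 = attended target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))       # near chance on random data
```

Once a classifier like this can tell which flickering box the user is attending to, the system can emit the corresponding letter, or a wheelchair command, as described above.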
Science and Tech