The Next Data Revolution is Here – AI Will Understand What Isn’t Being Said

In this special guest feature, Ran Margaliot, COO and VP R&D for Affogata, discusses how analyzing and sorting through unstructured data saves countless hours and recognizes patterns in seconds that even skilled data professionals may never uncover. That data “superpower” can lead to better products that adapt in real time, more responsive customer service, and insights that, when shared across an organization, magnify the impact even further. The impact is well demonstrated, with plenty of data and case studies to prove it; the final step is for human decision-makers to act sooner rather than later.

Making Android more accessible for braille users

Editor’s note: Today is Global Accessibility Awareness Day, and we’ll be sharing more on how we’re partnering with people with disabilities and what we’re doing to make education more accessible.

The heart of our mission at Google is making the world’s information truly accessible. But the reality is we can only realize this mission with the help of the community. This year at I/O, we announced one more step in the right direction, thanks to feedback and help from our users: we’re making it easier for braille readers to use Android. Available in our next Android 13 Beta in a few weeks, we are beginning to build out-of-the-box support for braille displays into TalkBack, our screen reader within Android.

A refreshable braille display is an electro-mechanical device that creates braille patterns by raising rounded pins through holes in a flat surface. Braille-literate computer users use the braille display to touch-read braille dots representing text. With the display, you can also type out braille. These devices help people with deafblindness access mobile phones and people with blindness use their phones silently.

Previously, people connected their Android devices to braille displays using the BrailleBack app, which required a separate download from the Play Store, or used a virtual keyboard within TalkBack instead of a physical device. With this new update, no additional downloads are necessary to use most braille displays. People can use braille displays to access many of the same features available with TalkBack. For instance, you can use display buttons to navigate your screen and then do activities like compose an email, make a phone call, send a text message or read a book.

There are also new shortcuts that make it easier to use braille displays with TalkBack. Now there are shortcuts for navigating, so it’s easier to scroll and move to the next character, word or line. There are also shortcuts for settings and for editing, like jumping to the end of documents or selecting, copying and pasting.

You can sign up for the Android beta program to try out TalkBack 13 in the next beta release. We are grateful to the community for their ongoing feedback that makes features like these possible. This is just the first step in developing this integration, and we can’t wait to do even more to expand the feature and create even more related capabilities.

Earn more from your video streams through automation

What if managing your video streaming business didn’t have to be so complex? What if your team didn’t have to dig through data across devices, apps, live streams and video on demand to find insights? We’ve built new solutions in Ad Manager to simplify these processes and save publishers time — helping you automatically uncover new opportunities, manage all your video streams with flexibility and ultimately grow your video revenue.

Automatically uncover new insights

Ad Manager already provides video-specific tools and time-based metrics to help you understand the true potential of every commercial break. But making sense of your video reporting data and finding insights for your business can be a challenging, manual task. Our new Programmatic Video Health Tools save time by highlighting opportunities you may have missed, right when you log into your account. These granular insights can help you determine why some inventory performs better than others at auction.

The programmatic video signals card automatically generates a snapshot of how your video inventory is performing. It shows signals that are important to advertisers, such as viewability, app or web domain name and audience information — plus their impact on revenue. These three dimensions make it easier for advertisers to value your inventory and can help you grow your revenue by identifying where these metrics can improve. Globally, publishers with high programmatic inventory signal coverage see an average 25% revenue uplift compared to inventory with low coverage.

We’ve heard from publishers that error reporting for lost ad requests (which can directly impact revenue) requires a lot of manual work across multiple video-specific reports. With the Video Ad Serving Template (VAST) errors insights card, Ad Manager uses automation to quickly show you the number of errors on your inventory and which line items are causing them, so you don’t have to spend time running a custom report. You can even sort the list of line items to find errors with the highest impact on revenue. By surfacing these actionable error insights early, the VAST errors insights card can help you increase revenue by fixing misconfigured settings or broken creatives.

Lastly, because we know every publisher has unique business needs, we’ll release a Video Performance Alerts solution to beta to help you automate insights based on specific requirements. With Video Performance Alerts, you’ll be able to create customized email alerts for your choice of campaign metrics and dimensions. For example, you can create an alert for when total impressions across a line item drop below an expected daily threshold. So instead of constantly logging in to check on campaign performance, you’ll get notified automatically.

Identify and fix problems faster with new reporting tools

Quickly finding a problem will help you resolve it sooner and earn more revenue. This is especially important during traffic spikes from large audiences tuning into new episodes or live events. So we created real-time video reporting to help you quickly get the information you need. With real-time video reporting, we’ve improved historical ad serving data availability by 10x, shortening the time to access performance data to under two minutes — so you can get ad unit or line item level data to find and fix errors before the next commercial break. Keep an eye out for new solutions over the coming quarters to help you identify and solve issues in real time.

For publishers serving ads to their content on YouTube with Ad Manager, we’ve launched a new troubleshooting tool to open beta. With the YouTube ads delivery tool, you can test ad delivery on YouTube inventory. It lets you see data like ad requests, key-values and details for the winning line items to help you validate and fix issues. To make sure everything is behaving and delivering as expected, you can even test line items and simulate requests.

Supporting your video streaming growth

We hope these new automated solutions and faster reporting tools give you the time and space to focus on growing your business. Look out for more updates this year to help you improve troubleshooting, test your setup and find more insights.
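The VAST errors the insights card surfaces correspond to the numbered error codes defined in the IAB’s VAST specification: when an ad fails to play, the player fires the tag’s `<Error>` tracking URL after substituting the failure code into the `[ERRORCODE]` macro. As a minimal, hypothetical sketch (the URL and ad ID are made up, not from Ad Manager), here is what that element looks like in a VAST response and how a player fills in the macro:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical VAST response containing an Error tracking element.
vast = """<VAST version="3.0">
  <Ad id="12345">
    <InLine>
      <Error><![CDATA[https://example.com/error?code=[ERRORCODE]]]></Error>
    </InLine>
  </Ad>
</VAST>"""

root = ET.fromstring(vast)
for err in root.iter("Error"):
    # The player replaces [ERRORCODE] with the numeric VAST code before
    # pinging the URL (e.g. 402 = timeout fetching the media file).
    ping = err.text.replace("[ERRORCODE]", "402")
    print(ping)  # https://example.com/error?code=402
```

Aggregating these pings by line item is essentially what the card automates, sparing you the custom report.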

Helping every student learn how they learn best

Editor’s note: Today is Global Accessibility Awareness Day. We’re also sharing how we’re partnering with people with disabilities to build products, and a new Android accessibility feature.

I often think about what Laura Allen, a Googler who leads our accessibility and disability inclusion work and is low vision, shared with me about her experience growing up using assistive technology in school. She said: “Technology should help children learn the way they need to learn; it shouldn’t be a thing that makes them feel different in the classroom.”

As someone who has spent years building technology at Google, I’ve thought a lot about how we can create the best possible experience for everyone. A big part of getting that right is building accessibility right into our products — which is especially important when it comes to technology that helps students learn. Ninety-five percent of students who have disabilities attend traditional schools, but the majority of those classrooms lack resources to support their needs. The need for accessible learning experiences has only intensified with the recent rise of blended learning environments.

[Images: educators working one-on-one with students on Chromebooks in their classrooms.]

We want students to have the tools they need to express themselves and access information in a way that works best for them. Here are a few recent ways we’ve built accessibility features directly into our education tools.

You can now add alt text in Gmail. This allows people to add context for an image, making it accessible for people using screen readers and helping them better understand exactly what is being shared.

We’ve improved the Google Docs experience with braille support. With comments and highlights in braille, students reading a Google Doc will now hear start and end indications for comments and highlights alongside the rest of the text. This change makes it easier for people using screen readers and refreshable braille displays to interact with comments in documents and identify text with background colors.

We added new features to dictation on Chrome OS. Now you can speak into any text field on the Chromebook simply by clicking the mic icon in the status area or pressing Search + d to dictate. The dictation feature can be helpful for students who have trouble writing — whether that’s because of dysgraphia, a motor disability or something else. You can also edit using just your voice: simply say “new line” to move the cursor to another line, “help” to see the full list of commands, or “undo” to fix any typos or mistakes.

Accessibility in action

We see the helpfulness of these features when they’re in the hands of teachers and students. My team recently spoke with Tracey Green, a teacher of the Deaf and an Itinerant Educational Specialist from the Montreal Oral School for the Deaf (MOSD) in Quebec. Her job is to work with students with hearing loss who attend local schools. She and Chris Webb, who is a teacher at John Rennie High School and also a Google for Education Certified Innovator and Trainer, have been using Google Classroom to support students throughout distance learning, as well as those who have returned to the classroom. For example, they integrate YouTube videos with automatic captioning and rely on captions in Google Meet. Their efforts to improve access to information during school assemblies kicked off a school-wide, student-led accessibility initiative to raise awareness about hearing loss and related accessibility issues.

Benefiting everyone

One phenomenon that underscores how disability-first features benefit everyone is called the “curb-cut effect.” When curbs were flattened to allow access for people with disabilities, it also meant greater access for bikers, skateboarders, and people pushing strollers or shopping carts. Everyone benefited. Similarly, accessibility improvements like these recent updates to our education tools mean a better experience for everyone.

We see this same effect time and time again among our own products. Take Live Caption in the Chrome browser, for example. Similar to Google Meet captions, Live Caption in Chrome captions any video and audio content in your browser, which can be especially helpful for students who are deaf or hard of hearing. It can also be helpful when people want to read content without sound so they don’t disrupt the people around them.

When we build accessible products, we build for everyone. It’s one of the things I love about working for Google — that we serve the world. There’s a lot of work ahead of us to make sure our products delight all people, with and without disabilities. I’m excited and humbled by technology’s potential to help get us closer to this future.

Stay up to date on the latest accessibility features from Google for Education.
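Alt text, as mentioned above for Gmail images, is the same mechanism HTML has always used: a textual description on the image element that screen readers announce in place of the picture. As a minimal, hypothetical sketch (this is not Gmail’s implementation, and the addresses and image are made up), here is an HTML email carrying alt text, composed with Python’s standard library:

```python
from email.message import EmailMessage

# Hypothetical example: an HTML email whose image carries descriptive
# alt text for people using screen readers.
msg = EmailMessage()
msg["Subject"] = "Field trip photos"
msg["From"] = "teacher@example.com"
msg["To"] = "class@example.com"
msg.set_content("Photos from the field trip (plain-text fallback).")
msg.add_alternative(
    '<p>A highlight from the trip:</p>'
    '<img src="cid:photo1" alt="Students planting seedlings in the school garden">',
    subtype="html",
)

# The HTML part now carries the description assistive technology reads out.
html_part = msg.get_body(preferencelist=("html",))
print("alt=" in html_part.get_content())  # True
```

Whatever the tool, the principle is the same: the description travels with the image, so the content stays meaningful when the image itself isn’t seen.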

How we build with and for people with disabilities

Editor’s note: Today is Global Accessibility Awareness Day. We’re also sharing how we’re making education more accessible and launching a new Android accessibility feature.

Over the past nine years, my job has focused on building accessible products and supporting Googlers with disabilities. Along the way, I’ve been constantly reminded of how vast and diverse the disability community is, and how important it is to continue working alongside this community to build technology and solutions that are truly helpful. Before delving into some of the accessibility features our teams have been building, I want to share how we’re working to be more inclusive of people with disabilities to create more accessible tools overall.

Nothing about us, without us

In the disability community, people often say “nothing about us without us.” It’s a sentiment that I find sums up what disability inclusion means. The types of barriers that people with disabilities face in society vary depending on who they are, where they live and what resources they have access to. No one’s experience is universal. That’s why it’s essential to include a wide array of people with disabilities at every stage of the development process for any of our accessibility products, initiatives or programs.

We need to work to make sure our teams at Google are reflective of the people we’re building for. To do so, last year we launched our hiring site geared toward people with disabilities — including our Autism Career Program to further grow and strengthen our autistic community. Most recently, we helped launch the Neurodiversity Career Connector along with other companies to create a job portal that connects neurodiverse candidates to companies that are committed to hiring more inclusively.

Beyond our internal communities, we also must partner with communities outside of Google so we can learn what is truly useful to different groups and parlay that understanding into improving current products or creating new ones. Those partnerships have resulted in the creation of Project Relate, a communication tool for people with speech impairments; the development of a completely new TalkBack, Android’s built-in screen reader; and the improvement of Select-to-Speak, a Chromebook tool that lets you hear selected text on your screen spoken aloud.

Equitable experiences for everyone

Engaging with and listening to these communities — inside and outside of Google — makes it possible to create tools and features like the ones we’re sharing today. The ability to add alt text, a short description of an image that is read aloud by screen readers, directly to images sent through Gmail starts rolling out today. With this update, people who use screen readers will know what’s being sent to them, whether it’s a GIF celebrating the end of the week or a screenshot of an important graph.

Communication tools that are inclusive of everyone are especially important as teams have shifted to fully virtual or hybrid meetings. Again, everyone experiences these changes differently. We’ve heard from some people who are deaf or hard of hearing that this shift has made it easier to identify who is speaking — something that is often more difficult in person. But in the case of people who use ASL, we’ve heard that it can be difficult to be in a virtual meeting and simultaneously see their interpreter and the person speaking to them. Multi-pin, a new feature in Google Meet, helps solve this. Now you can pin multiple video tiles at once — for example, the presenter’s screen and the interpreter’s screen. And like many accessibility features, the usefulness extends beyond people with disabilities: the next time someone is watching a panel and wants to pin multiple people to the screen, this feature makes that possible.

We’ve also been working to make video content more accessible to those who are blind or low vision through audio descriptions that verbally describe what is on the screen. All of our English-language YouTube Originals content from the past year — and moving forward — will now have English audio descriptions available globally. To turn on the audio description track, at the bottom right of the video player, click “Settings,” select “Audio track,” and choose “English descriptive.”

For many people with speech impairments, being understood by the technology that powers tools like voice typing or virtual assistants can be difficult. In 2019, we started work to change that through Project Euphonia, a research initiative that works with community organizations and people with speech impairments to create more inclusive speech recognition models. Today, we’re expanding Project Euphonia’s research to include four more languages: French, Hindi, Japanese and Spanish. With this expansion, we can create even more helpful technology for more people — no matter where they are or what language they speak.

I’ve learned so much in my time working in this space, and among the things I’ve learned is the absolute importance of building right alongside the very people who will most use these tools in the end. We’ll continue to do that as we work to create a more inclusive and accessible world, both physically and digitally.

Emza and Alif Demonstrate Fast, Ultra-Efficient Object Detection for Tiny AI Edge Devices

Emza Visual Sense, a pioneer in Tiny AI visual sensing, is joining with Alif Semiconductor to show how the combination of powerful, highly efficient Arm®-based hardware and optimized models can make AI a reality at the edge. The companies are demonstrating Emza’s trained face detection model running on Alif’s Ensemble™ microcontroller (MCU), the first MCU featuring the Arm Ethos™-U55 microNPU. The Emza model runs an order of magnitude faster on the Ensemble device with Ethos-U55 compared to a CPU-only solution, with extremely low power consumption.

UiPath Partners with Adobe to Automate End-to-End Digital Document Processes and Workflows

UiPath (NYSE: PATH), a leading enterprise automation software company, announced it has integrated its automation platform with digital document generation and e-signature capabilities from Adobe (NASDAQ: ADBE). By integrating with Adobe Document Services and Adobe Acrobat Sign to help customers automate end-to-end document processes, UiPath can boost employee productivity, enhance digital customer experiences and lower costs through seamless, uninterrupted digital document workflows.

Fostering inclusive spaces through Disability Alliance

I was 2 when my parents discovered I had polio, which impacted my ability to stand and walk. Growing up in China, I still remember the challenges I faced when I wanted to go to college. Back then, all potential candidates had to pass a physical test, which posed a challenge. Knowing this, my parents, my teachers and even the local government advocated for me. Thanks to their support, I was granted an exception to attend college, where I graduated with a degree in computer science.

When I joined Google in Shanghai in 2011, the real estate team was working to open a new office space. I was part of the planning process to ensure we designed an inclusive workspace, especially for individuals with physical disabilities. When I discovered the desks at the office were too high, or a meeting space was not wide enough for someone in a wheelchair to enter, I worked with the team to solve the problem. I also suggested building wheelchair-accessible restrooms when they were not available on the floor I was working on.

These experiences showed me that everyone has a voice to drive change — including myself. I decided to co-lead our Disability Alliance (DA), one of Google’s resource groups in China, with other passionate Googlers. We wanted to create a space to help address challenges Googlers with disabilities face, and build allyship among the wider Google community. We also wanted to create a platform to increase awareness of different forms of disabilities. For example, some people don’t think about invisible disabilities, but it’s equally important to build awareness of disabilities you might not immediately see. I’m incredibly excited to see how we continue to grow our community in the coming year across China.

Having a disability doesn’t limit me, and I’ve been fortunate to be surrounded by people who value my abilities instead of my disability. Over the years, I’ve achieved my goals and dreams, from leading an incredible team of 50 at Google, to taking on physical activities such as skiing and marathons, to driving change for the broader disability community.

[Image: I was ready to compete in a marathon in China back in 2021.]

As we commemorate Global Accessibility Awareness Day, I also spoke to Sakiko, a fellow member of our Disability Alliance chapter in Japan, to learn more about what drives her, and why it’s important that we provide equal opportunities for all.

[Image: Sharing my personal experience at an external event. I’m seated at the far right in a gray sweater.]

Tell us more about yourself. What keeps you going at Google after more than nine years?

I was born with spina bifida, and I move around with crutches. I’ve always wanted to work in sales, but when I was job hunting, I was turned down by several companies because of my disabilities. I knew I had the ability and knowledge to sell, and I enjoy interacting with people, so I didn’t give up. When I interviewed at Google, the interviewers focused on my potential and abilities, not my disability. That surprised me, because I’d never experienced that before. I recall asking one of my interviewers if my disability would impede this opportunity, and he said, “If you have the ability to sell, it shouldn’t stop you from doing that.” It was amazing and encouraging to hear. I currently work on the Google Ads team and have taken on various roles. When my clients share how grateful and thankful they are for my dedicated support, that really keeps me going.

What is a memorable experience you’ve had with the Disability Alliance?

I once hosted a workshop where we invited students with disabilities to get hands-on experience coding their own web application, giving them the confidence to pursue their interest in engineering. At the end of the event, several parents shared that they didn’t know their children had the potential to code and create applications all by themselves. I still remember that day vividly, because it demonstrates that everyone has a chance to shine when they are given the right opportunities to learn and develop new skills.