Short description: Self-development without a psychologist or meditation, using emotions and sensations.
Description: The effectiveness of the method has been demonstrated, and you can see it for yourself.
SIMPLER THAN MEDITATION, WITH FASTER RESULTS – Your level of experience does not matter; it suits even absolute beginners, and you will feel the first effect after just 8 minutes. – There is no need to strain, concentrate, hold a special posture or watch your breathing. Just sit down, close your eyes, put in your earphones and listen. Everything works by itself.
OUR KNOW-HOW, HYPNOTELLING, WILL HELP. We are used to thinking that we act rationally, but in fact we do not. In most situations we are driven by emotions that we barely recognise and hardly control.
The good news: you can still get what you want from the unconscious. To do so, you have to speak to it in its own language: images, feelings, pictures and the like. The unconscious is like a monkey: it does not understand words, but it does understand tangible things, namely situations that evoke emotions. Through these it can be «trained», «tamed» and «brought up». Voice builds a dialogue with the unconscious: it teaches, heals and helps.
For example, you can not only relieve anxiety, sleep problems, burnout, lack of energy, feelings of insecurity and so on, but also deal with their underlying source so that these unpleasant symptoms no longer return.
You find yourself in a fantastic story in which you are the main character, something like a «psychological quest»: you move through the plot, run into difficulties just as in real life, and learn to overcome them.
Unlike meditation, our tracks maintain a maximum level of activity and awareness. The brain enters a special trance state, learns intensively, and results come very quickly: the first effect can be noticed in the very first story, after 8 minutes.
VOICE CAN BE TRUSTED. Hypnotelling builds on a method that has been used to train athletes, accelerate student learning, treat addictions, and in supportive therapy, forensics, psychotherapy and scientific research.
HYPNOTELLING EFFECTIVELY HELPS you reduce stress, anxiety and irritability, normalise sleep, raise your energy level, get up more easily in the morning, get things done, concentrate, gain «mindfulness» as the effect of meditation without any meditation at all, and master powerful psychological skills that will help you throughout your life!
Requires Android: 7.0 and up. Russian interface: Yes.
Short description: Cheap international calls and SMS.
Requires Android: 5.0 and up. Russian interface: No.
Previous versions:
version 2019.22.249571726: Google Voice (post by iMiKED #85754821)
version 2019.20.248827150: Google Voice (post by iMiKED #85499004)
version 2019.18.246353881: Google Voice (post by iMiKED #85083441)
version 2019.18.244896381: Google Voice (post by iMiKED #84948847)
version 2019.16.244895543: Google Voice (post by iMiKED #84704813)
version 2019.16.242555088: Google Voice (post by iMiKED #84584898)
version 2019.14.240805279: Google Voice (post by iMiKED #83991121)
version 2019.11.240776012: Google Voice (post by iMiKED #83855033)
version 2019.11.237492715: Google Voice (post by iMiKED #83495741)
version 2019.09.234885482: Google Voice (post by iMiKED #82814882)
version 2019.07.232051384: Google Voice (post by iMiKED #82305676)
version 2019.05.230825622: Google Voice (post by iMiKED #81885008)
version 2019.03.227873883: Google Voice (post by iMiKED #81395314)
version 2018.50.224208644: Google Voice (post by iMiKED #80069315)
version 2018.47.221488369: Google Voice (post by iMiKED #79228065)
version 5.7.182806539: message No. 33, by And_RU
version 5.2.158469658: Google Voice (post by treake #62782603)
version 5.2.156378931: Google Voice (post by Оптик #62320858)
version 5.0.149662255: Google Voice (post by VLADFIONOV #59387708)
version 5.0.144897884: Google Voice (post by Ansaros #48881168)
version 0.4.7.10: Google Voice (post by Ansaros #48881168)
version 0.4.7.7: Google Voice (post by Ansaros #48881168)
version 0.4.7.6: Google Voice (post #34226253)
version 0.4.2.54: gv54.apk (4.53 MB)
Since the file for my phone is missing from the Market, the .apk was found here: http://apk-filez.blogspot.com/2012/03/goog. oice-04254.html (VirusTotal found nothing 😉)
P.S. After activation everything worked, but that was not enough for me: I immediately tried to top up my Google Voice balance (forgetting to disconnect from the US proxy), and as a result my Google Wallet got blocked (be careful).
IMHO it is better to experiment with a throwaway account rather than your main one. Good luck!
The vOICe for Android
Mobile augmented reality for a new way of seeing: with your ears!
March 25 2010: «Augmented Reality smartphone application helps visually impaired, elderly to see» (The vOICe for Android in The Independent UK)
April 19, 2010: The vOICe for Android running on Motorola Milestone in Berlin (YouTube clip)
September 25, 2010: Blind user reports on his use of Google navigation and The vOICe for Android (Google's eyes-free list)
January 2010: The vOICe for Android running on an NTT DoCoMo phone in Japan (blog post)
March 20, 2013: The vOICe for Android was featured by Russell Holly on geek.com, «App developer hopes to use Google Glass to help the blind see»
February 6, 2013: The vOICe for Android was featured by Paul Warner of VICT Consultancy in his podcast «Understanding The vOICe for Android»
June 15, 2012: The vOICe for Android mentioned in The Huffington Post article «Navigation Glasses For The Blind Help Visually Impaired See Through Sound»
April 6, 2012: The vOICe for Android and Google's Project Glass in Foundation Fighting Blindness blog article «We've Been Googled»
March 16, 2012: The vOICe for Android mentioned in PLoS ONE paper «'Visual' Acuity of the Congenitally Blind Using Visual-to-Auditory Sensory Substitution»
November 23, 2011: MP3 audio podcast of an interview about The vOICe for Android and augmented reality for the blind on That Android Show Episode 3
Camera-based sensory substitution and augmented reality for the blind (slides), invited presentation at ACIVS 2011 (Advanced Concepts for Intelligent Vision Systems), August 22, 2011, Ghent, Belgium
Seeing with Sound: an Android Visual Prosthesis for the Blind? Bionic Eye? Smart Camera?
Note: Google offers a free screen reader named TalkBack, part of their Android Accessibility Suite. For general Android accessibility issues you should consult Google's online Android Accessibility documentation. For best results disable TalkBack's «Enhanced focus» feature, or else you will need to tap the main screen area before you can navigate it by sliding your finger. For general accessibility it is recommended to select a phone with a physical keyboard, as there is limited accessibility support for the virtual keyboard on the touch screen. Also needed is a physical d-pad, trackpad, trackball, joystick or 4 arrow keys for navigation through applications, menus and options. However, a physical keyboard is not required for The vOICe for Android application itself. Moreover, by redefining the volume keys to act as up and down arrow keys while in menus and dialogs, The vOICe for Android is completely accessible even on phones that have a physical keyboard but no d-pad or equivalent.
The vOICe for Android application adds a sonic augmented reality overlay to the live camera view in real-time, thereby giving even totally blind people live and detailed information about their visual environment that they would otherwise not perceive. It may also serve as an interactive mobile learning app for teaching visual concepts to blind children. The vOICe for Android is a universal translator for mapping images to sounds.
Once started, The vOICe for Android continuously grabs and sounds snapshots from the camera. Each camera snapshot is sounded via a polyphonic left-to-right scan through the snapshot while associating height with pitch and brightness with loudness. For example, a bright rising line on a dark background sounds as a rising pitch sweep, and a small bright spot sounds as a short beep. This approach allows for sounding arbitrary images while largely preserving the image content as needed in sensory substitution for the blind. The visual sounds encode a visual resolution of up to 176 × 64 pixels (comparable to an implant with 10,000 electrodes!). Note that The vOICe for Android always runs in landscape mode, i.e., rotated 90 degrees left with respect to portrait mode. Try it on simple visual patterns first or you will be utterly confused!
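As an aside for technically inclined readers, the mapping can be summarized in a short sketch. The following Python fragment is only an illustration of the left-to-right, height-to-pitch, brightness-to-loudness principle described above; the frequency range, scan time and waveform details are assumptions, not the app's actual signal processing.

    import numpy as np

    def soundscape(image, scan_time=1.0, sample_rate=22050, f_lo=500.0, f_hi=5000.0):
        # image: 2D numpy array of brightness values in [0, 1]; row 0 is the top of the view.
        rows, cols = image.shape
        samples_per_col = int(scan_time * sample_rate / cols)
        t = np.arange(samples_per_col) / sample_rate
        # Exponentially spaced tone frequencies: top rows sound high, bottom rows low.
        freqs = f_hi * (f_lo / f_hi) ** (np.arange(rows) / (rows - 1))
        chunks = []
        for c in range(cols):                                   # left-to-right scan
            column = image[:, c][:, None]                       # brightness per row
            tones = np.sin(2 * np.pi * freqs[:, None] * t)      # one sine per row
            chunks.append((column * tones).sum(axis=0))         # brighter = louder
        return np.concatenate(chunks) / rows                    # rough normalization

Fed with a 176 × 64 brightness array and a one second scan time, this returns roughly one second of audio samples per camera snapshot.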
The vOICe for Android is available as a free application on Google Play and the Amazon Appstore, as well as in several smart glasses app stores.
Language: Google Play tagline
English: Augmented Reality for the Blind: See with your Ears!
Deutsch (German): Erweiterte Realität für Blinde: Mit den Ohren sehen!
Français (French): Réalité Augmentée pour les Aveugles: Voir avec vos Oreilles!
Español (Spanish): Realidad Aumentada para los Ciegos: ¡Vea con los Oídos!
Italiano (Italian): Realtà aumentata per non vedenti: Vedi con le tue orecchie!
Português de Portugal (European Portuguese): Realidade Aumentada para os Cegos: Veja com os seus Ouvidos!
Português do Brasil (Brazilian Portuguese): Realidade Aumentada para Cegos: Veja com seus Ouvidos!
Nederlands (Dutch): Augmented Reality voor Blinden: Zien met je Oren!
Русский (Russian): Расширенная Реальность для Слепых: Смотрите своими ушами!
Українська (Ukrainian): Доповнена реальність для сліпих: побачити своїми власними вухами!
Magyar (Hungarian): Kiterjesztett valóság vakok számára: Lásson a füleivel!
Slovenčina (Slovak): Rozšírená realita pre slepých: Pozerajte svojimi ušami!
Polski (Polish): Rozszerzona rzeczywistość dla osób niewidomych
Eesti (Estonian): Laiendatud tegelikkus pimedate jaoks
中文 (Chinese): 为盲人增强现实: 用你的耳朵看!
한국어 (Korean): 시각장애인을 위한 증강현실: 이제 귀로 볼 수 있습니다!
العربية (Arabic): الواقع المعزز للمكفوفين: أنظر بأذنيك!
Alternatively, you can download The vOICe for Android as a signed Android APK file directly from the URL
Installation directly from this or other websites using the Android browser may initially fail with a message like «Install blocked. For security, your phone is set to block installation of apps obtained from unknown sources». If this happens, you need to first enable «Unknown Sources» (from the main screen selecting Menu | Settings | Applications | Unknown Sources) to allow installation from the web, and try again. Do not download The vOICe for Android from other websites than those listed here, to minimize any risks of malware.
The vOICe for Android turns any Android device into a stand-alone computer vision system where all real-time image processing and audio synthesis is done by the device. Analogous to the fovea of the human retina, The vOICe for Android applies a foveal mapping to offer blind users a higher resolution central view and a lower resolution peripheral view.
For the best augmented reality experience, The vOICe for Android should be used with low-power camera glasses and stereo headphones. Suitable camera glasses for use with Android smartphones are not yet on the market, but self-contained Android smart glasses are. In particular, The vOICe for Android has been tested to run on Longway VISION-800 smart glasses (see the extensive usage notes for The vOICe), and it is known to also run on existing smart glasses from Vuzix, EPSON and ThirdEye Gen. However, you can also use Google Cardboard compatible devices to experience and practice sensory substitution at home with a cheap mobile setup. If you need to disable the touch screen to avoid spurious touch input while using such a headset to hold your smartphone, you can activate the entry «Disable touch until exit» in the Options menu to block touch events until you exit The vOICe app. Alternatively, you can use the Argus Touch Blocker / Disable Touch app and configure it, for instance, to unlock upon using the volume key; while running The vOICe, do a gentle sweep-down at the top of the screen to access the notification bar and activate Touch Blocker. Another cheap alternative to smart glasses is the clipaphone.com (Clip-A-Phone) smartphone headmount that lets you mount your smartphone on a baseball cap. Beware: The vOICe for Android has not yet been tested on ORA-X!
Blind user of The vOICe, here running on VISION-800 smart glasses
Running The vOICe for Android on smart glasses
For details on the use of The vOICe with smart glasses visit the smart glasses usage notes page, which focuses on the sub-$200 VISION-800 smart glasses.
Smartglasses GUI
(user action: Google Glass behaviour / Vuzix Blade behaviour, where a row lists both)
Tap: Toggle muted state / Enter Options menu (or select inside menu)
Double tap: Take hi-res photograph
Long press, after beep: vOICe command input
Swipe/fling left/back: Toggle negative video / 2 second soundscape (or move up inside menu)
Swipe/fling right/forward: Toggle color identifier / 0.5 second soundscape (or move down inside menu)
Swipe up (fling up): Toggle smartphone GUI / Faster speech
Swipe down (fling down): Exit («Back button») / Slower speech
1-finger 1 second press: Take hi-res photograph
2-finger forward swipe: Toggle negative video
2-finger backward swipe: 10-second 90 degree rotation
Pinch out: Increase sound volume
Pinch in: Decrease sound volume
2-finger upward swipe: Increase sound volume
2-finger downward swipe: Decrease sound volume
2-finger tap: Exit («Back button»)
In case of running The vOICe for Android directly on smart glasses it is often possible and advisable to use an external battery pack (e.g. from eBay or Amazon) with a USB cable to keep the relatively small battery from draining too quickly. In the case of Google Glass it is recommended to run The vOICe for Android only for up to a few minutes at a time to avoid overheating.
With smart glasses that lack access to the Google Play store, you can still easily install The vOICe for Android by side-loading, using ADB (Android Debug Bridge), a standard tool included with the Android SDK. To install, connect your smart glasses to your PC via a USB cable. From a DOS shell on the PC, within the folder where you saved vOICeAndroid.apk, apply the ADB command adb install -r vOICeAndroid.apk to install The vOICe for Android (the -r option replaces an existing installation), followed by adb shell am start -n vOICe.vOICe/.The_vOICe to launch The vOICe for Android on the connected smart glasses. You can use «adb shell am force-stop vOICe.vOICe» to stop The vOICe, and «adb uninstall vOICe.vOICe» to uninstall The vOICe.
On Google Glass, first turn on debug mode to enable ADB access: go to the Settings card and then to the Device info card, tap to enter it, and swipe forward until you see «Turn on debug»; tap again to enable debugging. The Glass screen must be on when launching The vOICe, so you may need to tap the touchpad a few times during the launch to avoid the 10 second screen timeout of Glass. On Glass, The vOICe can also be started with the voice trigger «OK Glass — start imaging — The vOICe for Android».
When running The vOICe on smart glasses, the graphical user interface (GUI) is by default different from the touch screen interface on a phone or tablet: compare the tables for «smartphone GUI» and «smartglasses GUI». On smart glasses you typically interact with The vOICe using a touchpad for swiping gestures and tapping. A long press activates the voice command input just like tapping the microphone does for the smartphone GUI for phones and tablets. Using two fingers to pinch in and out controls sound volume, a fast swipe (fling) left toggles negative video, and a fast swipe right toggles the talking color identifier on and off. If desired, you can switch between smartphone GUI and smartglasses GUI through a fast swipe up (fling up) gesture. This can be useful for changing advanced settings in the Options menu. For example, when run on smart glasses, The vOICe by default launches automatically after each reboot (device restart) to suit blind users in view of the often limited blind accessibility of smart glasses, but you can disable this by toggling the «autostart» option in the «Other settings» menu of the smartphone GUI.
Note that most information on this web page, unless indicated otherwise, still refers to the smartphone GUI for phones and tablets. Moreover, GUI behavior can be different for some brands and types of smart glasses, and the smartglasses GUI table now for instance contains a separate column describing user interaction with the Vuzix Blade glasses. In addition, the Vuzix Blade glasses can be conveniently controlled by the Vuzix Companion app, which runs on a Bluetooth-connected smartphone such that Google TalkBack can be used for blind accessibility.
Running The vOICe for Android with USB camera glasses on a Windows PC
The vOICe for Android can run on a notebook or netbook PC with Microsoft Windows via the BlueStacks App Player for Windows. In this manner, The vOICe for Android can work with affordable wide-angle USB camera glasses that require a Windows driver. You can (temporarily) disable any built-in webcam via Device Manager to ensure that the BlueStacks App Player will by default select the external USB camera glasses for use with The vOICe for Android.
Smartphone GUI for phones/tablets
(touch screen location, and the action upon double tap)
Top left, left half PIP1 *: Apply edge detection
Top left, left of PIP1: Slow down scanning
Top left, right half PIP1: Take hi-res photograph
Top right, left half PIP2: Toggle negative video
Top right, right half PIP2: Toggle color identifier
Top right, right of PIP2: Speed up scanning
Bottom left corner: vOICe command input
Bottom middle: Pop up Options menu
Bottom right corner: Launch soft keyboard
Left screen edge: Launch Eye-D **
Right screen edge: Launch Google Lookout **
All other locations: Toggle muted state
Swipe up (fling up): Toggle smartglasses GUI
Swipe down (fling down): Toggle Google Cardboard mode
* PIP = picture-in-picture camera preview area
** or user-defined app (specified by package name)
Note: double tap to toggle all camera functions on/off.
The vOICe key (with its voice command ***, if any): Result (Default)
Enter: Open Options menu
0 (or double tap): Toggle muted state (default: Off)
- (dash): Toggle camera release (default: Off)
_ (underscore), voice command «changes»: Toggle view change detection (default: Off)
space, voice command «where am i»: Say detailed location (GPS)
x: X-ray look-around locator (GPS + compass) (default: Off)
capital X: Look-ahead locator, a few seconds ahead (GPS) (default: Off)
capital S, voice command «speed»: Say current speed (GPS) (default: unknown)
capital A, voice command «altitude»: Say current altitude (GPS) (default: unknown)
d, voice command «date»: Say current date
t, voice command «time»: Say current time
l, voice command «level»: Say battery level (default: 100%)
capital L, voice command «light»: Say light level (lux)
1 (Alt-W), voice command «inverse»: Toggle negative video (default: Off)
3 (Alt-R): Toggle speech feedback (default: On)
7 (Alt-Z): Toggle color preview (default: On)
9 (Alt-C): Cycle contrast enhancement (default: 100%)
* or q (Alt-Q), voice command «color»: Toggle talking color identifier (default: Off)
" (double quote), voice command «object»: Toggle talking object identifier (default: Off)
# (Alt-A), voice command «torch»: Toggle torchlight if otherwise off (default: Off)
Volume up/down keys, voice commands «loud», «soft»: Cycle sound volume levels (default: 20%)
DPAD up: Higher speech rate, up to 3× normal (default: 1.5× normal)
DPAD down: Lower speech rate, down to 0.5× normal (default: 1.5× normal)
DPAD left, voice command «slow»: 2 second soundscape (default: 1 second)
DPAD right, voice command «fast»: 0.5 second soundscape (default: 1 second)
. (dot, period), voice command «braille»: Toggle tactile graphics display (default: Off)
capital I, voice command «blind»: Cataract simulator (default: Off)
/ (slash): Load image file and turn on tactile graphics (default: Off)
r, voice command «red»: Red-only color filter (default: Off)
g, voice command «green»: Green-only color filter (default: Off)
b, voice command «blue»: Blue-only color filter (default: Off)
c, voice command «cyan»: Cyan-only color filter (default: Off)
y, voice command «yellow»: Yellow-only color filter (default: Off)
o, voice command «orange»: Orange-only color filter (default: Off)
m, voice command «magenta»: Magenta-only color filter (default: Off)
s, voice command «skin»: Skin-only color filter (default: Off)
a: Analyze colors by cycling filters (default: Off)
w: White calibration (default: Off)
f, voice command «focus»: Apply autofocus
| (vertical bar): Toggle barcode detection (default: On)
capital O, voice command «OCR»: Toggle continuous high resolution OCR (default: Off)
p, camera button, or DPAD center, voice command «picture»: Save hi-res photograph
capital P: Share camera preview image
e, voice command «edge»: Edge detection * (default: Off)
z, voice command «zoom»: Zoom * × 2 (default: Off)
capital Z, voice command «telescope»: Zoom * × 4 (default: Off)
n, voice command «noise»: Image noise filter (3 states) (default: Off)
!, voice commands «leftmost», «describe»: Launch Eye-D **
?, voice commands «rightmost», «identify»: Launch Google Lookout **
Voice command «rotate»: Rotate camera view 90° counterclockwise for 10 seconds (default: On)
u: Activate UVC camera (default: Off)
Bat chirp (default: Off)
capital R, voice command «reset»: Reset to default settings
Search key: Toggle touch screen (default: On)
Open soft keyboard (default: Hidden)
* only on ARM, x86 and MIPS devices
** or user-defined app (specified by package name)
*** only on devices with speech recognition support
Real-time talking OCR
The vOICe for Android includes live text recognition with speech, meant for reading brief messages, newspaper headlines and labels. For reading whole pages of print you should revert to dedicated OCR (optical character recognition) packages. Soundscapes from The vOICe and any recognized text sound simultaneously, such that you can use the soundscapes to center the text in the camera view for better recognition. The soundscapes also indicate font size through the text rhythm and the angle of lines of print through changes in pitch. The vOICe refreshes the spoken text every few seconds to track changes in the camera view. This can mean that not all text is spoken before the next refresh occurs, then cutting the currently spoken text short. If that happens, simply looking away from the text can help to hear all the recognized text in the camera view. Any recognized text is automatically copied to the system clipboard, such that you can readily paste it into other apps if desired.
By default, The vOICe applies low resolution live OCR to keep you from being distracted too often by text appearing somewhere in your environmental view, and only switches automatically to high resolution live OCR once some large font text is detected in order to get more detail if available. However, if you are on the lookout for some text you can force The vOICe to apply high resolution live OCR continuously by toggling the continuous high resolution OCR option with key capital ‘O’ or the voice command «OCR».
Real-time talking object recognition
The vOICe for Android uses artificial intelligence to offer experimental support for real-time talking object recognition. The names of recognized objects are spoken along with confidence numbers between 0 and 1. The ultimate goal of the talking object recognition feature is to support you in learning to see with sound by giving you hints of what may lie ahead of you, thus helping you interpret the often very complex soundscapes of uncensored raw vision. It is intended only as a stepping stone towards letting you «see for yourself». In the end you should have less and less need for the automatic object recognition feature, much like you no longer have a need for a translator app once you become fluent in a foreign language. With the feature turned on, you may for example hear the soundscape rhythm of curtain folds, and the object recognizer should at the same time say something like «curtains». The deep learning neural network based feature is currently still unreliable and by default turned off, but you can easily toggle it on and off via the Options menu, entry «Say object names». Also beware that speaking the names of recognized objects inevitably implies some degree of censorship, because there are typically many more objects in a scene than can be spoken in real-time, such that a biased selection must be made.
Talking color identifier
The «*» (star, asterisk) or «q» key and the alternative «Alt-Q» key combination toggle the talking color identifier on and off. This mobile color recognizer for blind and color-blind people speaks the color of whatever shows at the center of the camera view. Note that results of color recognition depend on ambient light and camera quality. Recognized colors include (dark, normal, and light) red, green, blue, cyan, yellow, orange and magenta, as well as combination colors such as red-orange. Further color qualifiers exist for pale and vivid colors. Black, grey and white are also identified.
Further options exist for color filters, even including a filter for detecting exposed skin, e.g., for face detection, for detecting people or to help find empty seats. Color filters are toggled by pressing the first letter of the supported color name, such as «g» for green. Part of the screen’s background will track the selected color filter or the identified color. The color filters can also be used by blind and color-blind people to pick wires of a user-selected color, e.g., to distinguish red or yellow wires in electronics by pressing «r» or «y» for the red and yellow color filters. If it is not known in advance which colors are present in the view, key «a» can be applied to analyze the view by cycling over the supported colors. Beware though that the choice of color names can be culturally biased: cyan is a color in between green and blue, while magenta is basically the same as the color purple. Also, light-magenta and light-red make for the color pink or very similar colors, while dark-orange appears as a shade of brown. Dark yellow-green makes for olive-green.
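To make the idea of a single-color filter concrete, here is an illustrative Python sketch; the hue values, tolerances and the choice of the HSV color space are assumptions for illustration only, not The vOICe's published thresholds.

    import colorsys
    import numpy as np

    HUES = {"red": 0.0, "orange": 1/12, "yellow": 1/6, "green": 1/3,
            "cyan": 1/2, "blue": 2/3, "magenta": 5/6}

    def color_filter(rgb, name, tolerance=0.06):
        # rgb: H x W x 3 array of values in [0, 1]; returns the view with all
        # pixels blacked out except those close to the requested hue.
        h, s, v = np.vectorize(colorsys.rgb_to_hsv)(rgb[..., 0], rgb[..., 1], rgb[..., 2])
        d = np.abs(h - HUES[name])
        d = np.minimum(d, 1.0 - d)                       # hue wraps around
        mask = (d < tolerance) & (s > 0.3) & (v > 0.2)   # ignore grey and dark pixels
        return rgb * mask[..., None]

Calling color_filter(view, "red") blacks out everything except reasonably saturated reddish pixels, which is essentially what pressing key «r» asks for.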
When using color identifier or color filter, The vOICe automatically turns on the torchlight (LED flash) on phones that support this. This can if desired be suppressed via the «Other settings» menu. The torchlight can bring performance a little closer to that of dedicated color identifiers that have a built-in reference light.
Note that results of color recognition inevitably depend on ambient light and camera quality. Try to use good lighting whenever possible, preferably broad daylight. Still, under relatively low light conditions, better results may be obtained by first calibrating The vOICe for the given visual environment. To do this, point the camera to a known white surface (such as a white sheet of paper) near the object of which you want to identify the color, and apply «white calibration» by pressing key «w». It will basically tell The vOICe that this surface really is white or light grey rather than its actual grey or colored appearance in the camera view. This will also correct for the yellowish colors from incandescent lighting and many other sources of color bias. Next you can point the camera to other items of interest to identify their colors. Apply the calibration option with great care, because it is a double-edged sword: only apply it when you are certain that the full camera view is indeed white and relatively bright, or else you may get very poor color identification results due to a badly skewed color calibration! Calibration settings do not persist across runs to avoid unintended continued use of a calibration that would no longer match changing ambient light conditions. The vOICe does not normally need calibration in broad daylight conditions, but if applied with care, it can yield significantly more accurate color recognition results under relatively low light conditions. The calibration process takes only about a split second and applies for the duration of the run unless you recalibrate or reset The vOICe.
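The principle behind this white calibration is simple per-channel gain correction; the sketch below is only an illustration of that idea, not the app's actual code.

    import numpy as np

    def white_gains(reference_view):
        # reference_view: H x W x 3 view of a white or light grey surface.
        ref = reference_view.reshape(-1, 3).mean(axis=0)     # average R, G, B
        return ref.max() / np.maximum(ref, 1e-6)             # per-channel gain

    def apply_white_balance(view, gains):
        return np.clip(view * gains, 0.0, 1.0)               # color-corrected view

A yellowish white reference, as under incandescent light, then receives a larger blue gain, pulling subsequent views back towards neutral.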
Talking compass
The vOICe for Android includes a talking compass that speaks the current heading. By default it only speaks heading changes, but options for continuous heading feedback and for disabling the talking compass are available via the Options menu. The eight supported spoken compass directions are north, north-east, east, south-east, south, south-west, west and north-west. The screen’s see-through overlay with a virtual 3D foreground visually tracks detected changes in direction. Note that your device’s compass may be auto-calibrating: if it appears to give wrong results, move it slowly around in all directions (e.g., figure eights) to help the compass recalibrate. You can change verbosity via the Talking compass dialog in the Options menu. In case you want to know not only the heading (azimuth), you can select the 3D rotation option, which speaks heading (yaw), inclination (slope, pitch) and roll in degrees. The vOICe then acts like a talking inclinometer or theodolite.
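Reducing a heading to one of the eight spoken directions amounts to rounding the azimuth to the nearest 45 degree sector, roughly as in this small sketch (illustrative only):

    DIRECTIONS = ["north", "north-east", "east", "south-east",
                  "south", "south-west", "west", "north-west"]

    def spoken_heading(azimuth_degrees):
        sector = int((azimuth_degrees % 360) / 45.0 + 0.5) % 8   # nearest 45-degree sector
        return DIRECTIONS[sector]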
Beware: On phones with a physical keyboard, the keyboard must be closed for the talking compass to speak! This is because the magnetic distortion by an open keyboard can give bad compass readings.
Together with the camera-based soundscapes from The vOICe, the talking compass may help to walk in a straight line (avoiding veering through the continuous speech feedback option in the talking compass dialog), and of course the compass continues to work in the dark where the camera fails to give useful feedback. Note that you can if necessary recalibrate the compass of your device by opening the Google Maps app and then moving your device in figure 8 patterns.
Talking locator
The vOICe for Android includes a talking locator that speaks nearby street names and intersections as determined from GPS (or the Russian ГЛОНАСС, Glonass) satellites or local cell towers, for increased location awareness. In other words, it helps you to know where you are. As supplemental information it can upon request tell you the current speed and altitude, although that information is not always available and can take some time for the GPS system to acquire. You can change verbosity via the Talking locator dialog in the Options menu. In the Other settings dialog you can select the metric system (default) or the US/Imperial system of units. The talking locator makes use of the Google Maps API, and thus needs to share your location with Google in the same way that Google Maps or Google Navigation does. It is not shared with anyone else.
Special «x-ray» locator mode: key «x» toggles a look-around locator mode that speaks street names and intersections ahead of your physical location, unobstructed by any buildings. It tells you what is in the direction that you are pointing your phone, at a distance that depends on the forward or backward tilt of your phone. This mode requires access to GPS, magnetic compass and acceleration sensor. When your phone is in landscape orientation and held upright, as when taking a photograph, the location range is 100 meters, but when you tilt the phone forward or backward this range increases or decreases. When you tilt the phone forward until it is held horizontally, with the camera pointing to the ground, the range increases to 300 meters. If you tilt the phone still further forward you can have a look-around range of up to 500 meters maximum. So you get a kind of spoken «scan» of what streets and intersections lie on a line ahead of you by tilting the phone without changing your heading, and by changing your heading (as in looking around) without changing the tilt of your phone you can scan street names along a circle of a radius belonging to the current tilt angle. By tilting the phone upward towards the sky you can shorten the range down to a minimum of about 25 meters. To avoid confusion this mode is best used while standing still as you turn around and play with the tilt angle. Note that you can always double-tap the middle of The vOICe main screen or press the power button to mute the soundscapes. This can help with speech intelligibility when you do not need the soundscapes.
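The exact tilt-to-range curve of this mode is not documented here; the sketch below merely connects the anchor points mentioned above (about 25 meters minimum, 100 meters upright, 300 meters horizontal, 500 meters maximum) with an assumed piecewise-linear interpolation over an assumed tilt scale.

    import numpy as np

    # Assumed tilt convention: 0 degrees = held upright (as when photographing),
    # +90 degrees = tilted forward until horizontal (camera facing the ground),
    # negative = tilted upward towards the sky; the -90 and +135 degree extremes
    # are guesses for where the 25 m and 500 m limits are reached.
    TILT_DEG = [-90.0, 0.0, 90.0, 135.0]
    RANGE_M = [25.0, 100.0, 300.0, 500.0]

    def lookaround_range(tilt_degrees):
        return float(np.interp(tilt_degrees, TILT_DEG, RANGE_M))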
Another (look-ahead locator) mode, toggled by key capital ‘X’, is rather aimed at use inside a vehicle such as a bus, taxi or train. This mode requires access to GPS. It speaks streets and intersections in the direction that you are traveling, typically several seconds ahead of the moment that you will pass them. Beware that in this case the heading direction is not derived from the orientation of your phone but instead from the direction that you are traveling, as obtained from subsequent GPS readings. So you must be moving for this to work. The look-ahead time is larger when your phone is held horizontally (camera pointing to the ground) than when it is held upright.
A special look-around mode is activated whenever the camera is aligned with the horizon: in that case The vOICe will instead of your current location speak street names and intersections that lie 100 meters ahead of you in the direction that you are pointing the camera. Thus you can effectively scan your environment for nearby streets and intersections.
Real-time navigation and route guidance is currently not supported by The vOICe for Android, because use of Google Maps data for this purpose is not allowed without prior authorization, with the ToS (Terms of Service for Google Maps) reading:
«Unless you have received prior written authorization from Google [...], you must not [...] use the Service or Content with any products, systems, or applications for or in connection with (i) real time navigation or route guidance, including but not limited to turn-by-turn route guidance that is synchronized to the position of a user's sensor-enabled device; or (ii) any systems or functions for automatic or autonomous control of vehicle behavior.»
However, Google Maps on Android now offers free turn-by-turn GPS based Google Navigation with voice guidance, also for spoken walking directions. This functionality can be applied by blind users in combination with The vOICe for Android (blind user report on eyes-free list). The vOICe for Android must be running in the foreground, so one first starts Google Navigation and sets it up for giving directions at a suitable sound volume, and next one starts The vOICe for Android ( usage details on eyes-free list). To prevent accidental touch screen events with The vOICe, you can toggle sensitivity for touch screen events on and off using the Search key of your phone. In case you accessed Google Navigation to change its settings, you can easily return to The vOICe by acting as if restarting it, e.g. via a long-press on the Home key to select it from the recently run apps.
As with the talking compass, the talking locator continues to work in the dark where the camera fails to give useful feedback. In situations where feedback from the camera is not needed or desired, or for a longer battery life, you can toggle and release the camera either by tapping the center of the touch screen twice rapidly, or by pressing key «-» (dash, minus). Speech feedback from the talking compass and talking locator will continue, but there will be no soundscapes and no speech feedback from the talking color identifier as long as the camera remains released. Note that this is different from tapping the center of the touch screen once or pressing key «0» once, which toggles muting of all sound and does not involve releasing the camera.
Note: to let others (selected «friends») know where you are, you can just use Google Latitude.
«Called vOICe, the software [...] is being designed to integrate with hand-held GPS systems when they come on the market. Dr. Meijer envisions the user of his device guided by GPS and speech software to the door of a building, then prompted about the location of the knob by the finely tuned camera view. He says the technology will allow a user to accurately grasp a cup of coffee just set down on a table.» The Christian Science Monitor, July 10, 2003, Computers & Technology section, in an article by Lakshmi Sandhana titled «Seeing-eye and navigation technologies mean more freedom for the blind, A hand-held device that reads GPS signals, and one with a mini-camera, promise big advances».
Another Google Maps accessibility tool developed by Google is Intersection Explorer, which speaks the street names at the intersection nearest to the position touched on the screen, and uses vibration to indicate the presence of streets. For other GPS-based navigation options, check out Vodafone Wayfinder OSS (open source) and Loadstone GPS (open source).
Talking face detector
The vOICe for Android includes a talking face detector that speaks the number of human faces detected in the live camera view. It can detect and announce up to dozens of faces in a single view. On the other hand, if only one face is detected then it will additionally say whether the face is located near the top, bottom, left, right, or center of the view, as well as speak when the face is within close-up range. The talking face detection feature can help with blind photography as well as with many other situations that need judging the presence of people. Note that the talking face detector supplements the optional use of the skin color filter which can give more detailed information about the size and position of multiple faces in the camera view.
Face detection is just that: it is not face recognition, so there are no privacy concerns. Moreover, the talking face detector can be turned on and off via the «Other settings» menu. Keeping face detection turned off can be advisable when mostly using the soundscapes, because face detection adds a noticeable CPU load and hence battery drain. You may also notice a significant face detection lag of several seconds, and an overall more sluggish response while the talking face detector is enabled.
Good vibrations: refreshable tactile graphics
The vOICe for Android has an experimental tactile display mode, toggled via the Options menu, by pressing key «.» (dot, period), or by applying the mnemonic voice command «braille». The tactile graphics mode allows you to feel the live camera view using the touch screen. Thus you can explore the view by touch. Any bright areas in the live camera view cause vibration as you slowly swipe your finger over the touch screen. Note that the perceptual effect is crude and limited by the simplicity of the phone’s vibrator, which for instance cannot track fast motion if based on a vibration motor, and is not suitable for representing multiple levels of brightness. However, unlike electrovibration, vibration continues at a bright spot also if your finger does not move. The adequacy for tracing a view is strongly dependent on visual contrast and lighting, and thus requires some experimentation. To follow the edge of a bright area, zigzag your finger in a wavy manner as you trace the edge. You can apply negative video, color filters or edge detection if needed for best results.
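Conceptually, the tactile mode only has to decide whether the camera pixel under the finger is bright enough to vibrate; the following minimal sketch illustrates that decision (the threshold and the screen-to-image mapping are assumptions, not the app's implementation):

    def should_vibrate(view, touch_x, touch_y, screen_w, screen_h, threshold=0.5):
        # view: 2D grayscale camera frame with values in [0, 1] (rows x cols);
        # (touch_x, touch_y): finger position in screen pixels.
        rows, cols = len(view), len(view[0])
        col = min(int(touch_x / screen_w * cols), cols - 1)   # map touch to image column
        row = min(int(touch_y / screen_h * rows), rows - 1)   # map touch to image row
        return view[row][col] > threshold                     # bright pixel => vibrate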
You simultaneously get the complete view in sound, thus providing you with the visual context and a fast overview, but the supplementary tactile feedback can help with the exploration and interpretation of complex views and their correspondingly complex soundscapes. You can point the phone to any high-contrast printed graphics such as a bar chart, for use as a highly affordable and mobile refreshable tactile graphics display, but it goes far beyond that. For instance, one can think of feeling the rows of windows in a building from a distance, locating doorways if visual contrast is adequate. Note that this may also prove useful for totally deaf-blind people.
All other touch screen event handling is disabled while running in this mode, such that the full view is available for tactile exploration. Note that the vibration motor of a phone can sometimes get stuck in a metastable position such that it no longer vibrates; a gentle tap on the side of the phone can then often help to release it and regain vibration.
A special multitouch feature lets you select a rectangular area of interest, using two fingers for opposite corners of the rectangle. Tactile feedback will then be switched off, and you will hear the soundscape only for the selected area. This lets you more easily explore smaller parts in complex soundscapes. In addition, the 2-finger area selection also masks the talking OCR outside the selected area, for instance for use in locating and identifying axis labels in graphs on paper or on a computer screen by only hearing any text within the selected area.
If you want to explore an image file you can do so by pressing key «/» (slash), which will pop up a file requester for loading an image file. After selecting the image file of interest, its soundscape will be sounding and the tactile graphics display will be active, such that you can explore the image by touch. You can press the Back button to leave this mode.
By using the TalkBack screen reader in combination with the BrailleBack accessibility service, even completely deafblind people can access and make use of The vOICe’s tactile graphics mode. Or if you want to add some extra tactile oomph to The vOICe soundscapes, you may consider using the headphones jack of your device to pass the soundscapes directly to a KOR-FX 4DFX haptic gaming vest or to David Eagleman’s and Scott Novich’s vibratory VEST (once available). The above-mentioned 2-finger area selection can here be helpful to control what part of the soundscape activates the tactile vest. The tactile feedback might also prove supplementary in navigating noisy environments that would otherwise drown out the soundscapes.
The cataract simulation mode displays a heavily blurred camera view, similar to what people with severe cataracts see. It is meant for normally sighted users wearing smart glasses who want to experience and visualize how The vOICe adds visual detail to very low vision. Seeing the degraded visual view while at the same time hearing the visually more detailed soundscapes tempts the brain of the sighted user to fill in visual detail that is missing on the visual display with mental imagery of what can be heard in the soundscapes. Sensorimotor contingencies in sensory substitution with The vOICe here match low grade «cataract» vision with higher grade soundscapes. Give it a try and experience to what extent the extra visual detail in the soundscapes becomes a truly visual experience. You can toggle the cataract simulator on and off in the «Other settings» menu, or use the shortcut key or voice command.
Shake The vOICe
By briefly shaking your phone you can initiate a user-specified action, as set in the Shake action dialog in the Options menu. You can shake the phone to either toggle mute, camera release, color identifier or torchlight, or you can have your phone say the current location, time, speed or altitude. It is also possible to disable the shake actions by selecting «do nothing».
Front-facing cameras in Android display the user’s face as if looking into a mirror. However, the left-right flipped view is of limited use for blind users of The vOICe for Android. In the case of the Nexus 7, which only has a front-facing camera and no back-facing camera, it is generally more useful to treat a front-facing camera like a back-facing camera, i.e., turning the phone around to make the camera face away from the user. For this reason, The vOICe makes corresponding mapping changes, such as scanning the mirroring camera view from right to left, changing the compass directions by 180 degrees, and changing the spoken location of human faces. In other words, The vOICe assumes that the front camera is used facing forward, away from the user. The downside for sighted users is that results will seem «wrong» when the front camera is still facing them.
Need wide-angle view?
Especially in mobility applications, it is important to detect objects and obstacles to the side, as well as at floor level in front of you. This may require a wider field of view (FOV) than most phone cameras provide. For this purpose a convenient accessory is the USBfever 180° magnetic detachable fish-eye lens, which does not require removal of the original lens but is simply magnetically mounted in front of the phone lens, and is compatible with multiple phones. Note that the 180° lens, despite its name, in general does not give a 180° view but a view that is 2 to 3 times wider than without the add-on lens, up to a maximum of about 180°; e.g., one gets an 80° to 120° view in case the view was 40°.
A wide-angle view can also help blind users of The vOICe to more quickly and easily build a mental overview of their visual environment. It is best used in combination with the default foveal mapping to avoid that items near the center of the camera view appear very small. Note that the camera of Google Glass offers a horizontal FOV of 54.8° and a vertical FOV of 42.5°, and there are no known options yet to increase the viewing angle through an add-on lens.
Virtual reality (Google Cardboard) mode for augmented reality
The virtual reality (VR) mode is meant only for sighted users of Google Cardboard and compatible devices (ColorCross, Vrizzmo, Fibrum, Durovis Dive, Homido, etc), preferably with a head strap for hands-free use. The headset must have an opening on the front side where your phone’s camera lens is located, such that the camera view is not blocked. The VR mode is toggled by a swipe down on the main screen, and shows two identical views side by side (a single camera does not allow for truly stereoscopic views). It lets sighted users experience The vOICe immersively, with the low resolution view matching the visual content of the soundscapes, including foveal enlargement and peripheral compression. This makes it easier to learn to associate visual content with the corresponding soundscape content, and lets sighted people better understand what The vOICe can mean for blind people. Apart from the swipe-down toggle, the controls in the VR mode are the same as in smartglasses GUI. For best results connect stereo headphones, while you may also want to turn on the full screen mode via the «Other settings» menu and restart. Optional extensions include a fish-eye lens for a truly wide-angle view, and a Rii wireless mini-keyboard to remotely control any settings of The vOICe. For more information about learning to see with sound, check out The vOICe Training Manual for the blind.
The vOICe for Android includes experimental support for UVC compliant cameras (UVC compliant webcams such as the Logitech HD Webcam C270, with USB device ID 046d:0825). UVC stands for USB Video Class, or Universal Video Class. If a UVC compliant external camera is detected at startup of The vOICe, it will automatically attempt to use that UVC camera instead of any built-in camera. It works on regular non-rooted Android phones and tablets with stock ROMs, but it requires that the Android device runs Android 4.0 or later and has USB host support. UVC camera support unfortunately no longer works on non-rooted Android 4.4 (KitKat) and later devices due to tightened security in the SELinux (Security-Enhanced Linux) layer underneath Android. It may also lead to stability problems on devices that lack adequate support for UVC compliant cameras, that have incomplete USB host support, or that cannot handle the power drain of the UVC camera. Some functions are not available when using a UVC camera, such as face detection or taking a snapshot. Reported quirks in using a UVC compliant webcam include frequently unrecognized cameras, getting no audio unless headphones are connected, stuttering audio, a flickering camera view, and freezing devices (requiring a full reboot, no fastboot). The UVC camera mode reportedly crashes some Samsung device types (e.g. Galaxy S3, I8262). So only use UVC compliant cameras with The vOICe at your own risk. UVC camera support can be completely disabled via the «Other settings» menu. It is hoped that affordable UVC compliant camera glasses will become available in the future, which is the main reason why UVC camera support is being developed.
Stereo and 3D audio
The vOICe for Android supports stereo panning and 3D audio spatialization on suitable stereo-enabled Android devices with a stereo headset, for instance using a Bluetooth stereo headset for a convenient wireless connection with your phone, although many modern Android phones also support the use of regular wired audio headsets. The vOICe will detect whether a wireless headset is connected and then switch to stereo or mono accordingly. In the case of a Bluetooth stereo headset, consult your headset manual for details on how to activate and pair it with your phone, and on your phone select appropriate settings in Settings | Bluetooth for activating Bluetooth, pairing the Bluetooth headset with your phone, and routing media audio to it (a long-press on a connected Bluetooth device gives further options). Sighted users may notice that the visual scanline moves out of sync with the panning audio by up to several hundred milliseconds when using a Bluetooth headset. This is because Bluetooth A2DP tends to add extra latency. Better results may be expected from Android devices and Bluetooth headsets that are aptX compliant.
The stereo panning makes perception of lateral position in image scans more intuitive, while binaural cues offered by 3D audio (spatialized audio) may further help with object segregation in complex scenes. Options for Mono (default), Stereo and 3D audio are available in the Audio rendering dialog in the Options menu. Mono is the default because many users will on their first run of The vOICe for Android software not be wearing a stereo headset, and devices lacking stereo capabilities may give distorted mono sound when set to the Stereo or 3D audio mode. The Stereo and 3D audio mode also add to the CPU load.
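For the curious, the way a lateral image position could drive the stereo balance can be sketched with an equal-power pan law; the actual panning curve used by The vOICe is not specified here, so treat this purely as an illustration.

    import numpy as np

    def stereo_gains(column, num_columns):
        # Returns (left_gain, right_gain) for a column index, 0 = far left.
        pan = column / (num_columns - 1)      # 0.0 = full left, 1.0 = full right
        angle = pan * np.pi / 2.0             # equal-power (constant energy) panning
        return np.cos(angle), np.sin(angle)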
The vOICe's stereo and 3D audio is optimized for application in immersive synthetic vision rather than for psychoacoustic positional audio realism, and it is designed for use with a discrete stereo headset rather than a surround sound speaker set. Blind users can wear regular open stereo headphones just above or just behind the ears for minimal obstruction of environmental sounds and for maximum compatibility with natural echolocation.
While headphones are connected or while a screen reader is running, some of the «eye candy» display features such as the semi-transparent backdrop images are automatically disabled for best performance during serious use.
Speech recognition for voice commands
The vOICe for Android offers voice command support on devices with speech recognition — typically those that come with Google voice search. Tapping the microphone image on the touch screen at the bottom left activates speech recognition for several seconds, during which you can give any of a number of voice commands: changes, where am i, speed, altitude, date, time, level, light, inverse, color, torch, loud, soft, slow, fast, braille, red, green, blue, cyan, yellow, orange, magenta, skin, focus, picture, edge, zoom, telescope, leftmost (or describe), rightmost (or identify), reset. For instance, tapping the microphone and saying «skin» should toggle the skin color filter. Saying «picture» will take a high resolution snapshot. Saying «level» gives the battery level as a percentage of a full charge. (The battery level is also automatically spoken once it drops to 10 percent or lower, to indicate that the battery may soon be getting critical.) The speech recognition mode times out after several seconds, such that for each new command you will need to tap the bottom left of the touch screen again. Note that speech recognition reliability can be poor depending on ambient noise and the quality of the microphone and the speech recognition engine. Speech recognition currently requires a remote connection to Google’s servers for sending and analyzing the speech sounds. If the microphone image is not showing on your device then no speech recognizer is installed. In that case you can install Google Voice Search to obtain a speech recognizer.
IP camera support
The vOICe for Android supports the use of IP cameras that offer live MJPEG video streams. Usage involves specifying a suitable IP camera URL in the Camera dialog in the Options menu. The feature is aimed at use with future IP camera glasses that offer a wireless connection with your smartphone via your local WiFi network or via a portable WiFi hotspot on your smartphone. Note that latency (camera view lag) can be significant, depending on brand and type of IP camera and the IP camera’s settings.
An example of IP camera glasses is given in the YouTube video clip Aishine AI-G03S Wifi camera glasses with The vOICe for Android.
Support for using CamFind visual search
The vOICe for Android supports use of the CamFind visual search engine, available from Google Play. Touching the left edge of the screen launches CamFind, if installed, or finds CamFind on Google Play to let you install it. CamFind uses crowdsourced human image recognition to return results, typically within half a minute. The quality of human image recognition is generally still far beyond that of machine vision based image recognition. As an alternative for tapping the left screen edge you can also tap the microphone image at the bottom left of the screen and say «leftmost». Note that CamFind runs in portrait orientation. You can use the Back button to return to The vOICe for Android.
Support for using Eye-D object recognition
The vOICe for Android supports use of the Eye-D object recognition engine, available from Google Play. Touching the left edge of the screen launches Eye-D, if installed, or finds Eye-D on Google Play to let you install it. As an alternative for tapping the left screen edge you can also tap the microphone image at the bottom left of the screen and say «leftmost». Note that Eye-D runs in portrait orientation. You can use the Back button to return to The vOICe for Android.
Support for using Aipoly Vision
The vOICe for Android supports use of the Aipoly Vision object identifier, available from Google Play. Touching the right edge of the screen launches Aipoly, if installed, or finds Aipoly on Google Play to let you install it. The quality of machine vision based image recognition is typically worse than human image recognition, but the turnaround time can be much shorter. As an alternative for tapping the right screen edge you can also tap the microphone image at the bottom left of the screen and say «right edge». Note that Aipoly runs in portrait orientation. You can use the Back button to return to The vOICe for Android.
Support for using Google Lookout
The vOICe for Android supports use of the Google Lookout object identifier, available from Google Play or as a 64-bit APK file for sideloading from APKMirror. Touching the right edge of the screen launches Google Lookout, if installed, or finds it on APKMirror to let you install it (on Google Play currently Google Pixel phones are supported). The quality of machine vision based image recognition is typically worse than human image recognition, but the turnaround time can be much shorter. As an alternative for tapping the right screen edge you can also tap the microphone image at the bottom left of the screen and say «right edge». Note that Google Lookout runs in portrait orientation. You can use the Back button to return to The vOICe for Android.
In the future there may follow support for an Android version of Microsoft's SeeingAI app.
Support for barcodes and QR codes
The vOICe for Android supports use of the free ZXing barcode image processing library, available from Google Play. You can first use The vOICe to locate a barcode or QR code «visually» from the characteristic sounds of these patterns. Barcodes are fairly small, such that the phone's camera must typically be held some 10 to 20 centimeters from the object holding the barcode, and the camera view may be blurred at such a short range. Apply autofocus by pressing key «f» or tapping the top middle of the screen to sharpen the camera view and make the barcode rhythm sound much more «crisp» at close-up. Touching the left edge of the screen then triggers the barcode scanner, if installed, or finds ZXing on Google Play to let you install it. As an alternative for tapping the left screen edge you can also tap the microphone image at the bottom left of the screen and say «barcode». If a QR code is found and recognized, the browser gets launched for what is assumed to be an associated URL. Otherwise, if a barcode is found and recognized, the browser gets launched and a Google search for the barcode is performed. It is assumed here that you have the necessary accessibility tools for browsing the web, because The vOICe cannot provide accessibility in third-party applications. When done you can use the Back button to return to The vOICe for Android.
Support for using Google Goggles
Touching the far right edge of the screen launches the Google Goggles machine vision app, if installed, or finds it on Google Play to let you install it. As an alternative for tapping the right screen edge you can also tap the microphone image at the bottom left of the screen and say «right edge». You can use the Back button to return to The vOICe for Android.
Some visual patterns such as barcodes are recognized automatically by Google Goggles, but for most other patterns you need to explicitly request a recognition attempt by pressing the center d-pad key or equivalent (as an alternative for the inaccessible graphical button of Google Goggles near the right screen edge at mid height). Sometimes you will not properly hear or understand what Google Goggles recognized and speaks via your screen reader. A useful trick is then to press the menu key twice, and you will get the recognition result repeated. Accessible activation of another recognition attempt depends on the state that Google Goggles is in. If you just got a (failed) recognition attempt, pressing the menu key gives you three buttons for «Share results», «New search» and «Help» that you can navigate with your screen reader to activate «New search». After selecting «New search», you can again apply the center d-pad key to activate another recognition attempt. There is no need for using the Back button unless you want to leave Google Goggles and return to The vOICe.
Note: The vOICe for Android has no control over the accessibility of Google Goggles, and a tighter integration of the Google Goggles functionality may be possible once a Google Goggles API becomes available. Currently, Google Goggles is still partially inaccessible to blind users. Also, image recognition by Google Goggles often still performs poorly in typical human environments, but that might improve in the future through Google’s efforts in deep learning neural networks.
Support for your own favorite apps
In case you have your own favorite apps that you want to launch by tapping the left or right screen edge in The vOICe main screen, you can specify that via the «External apps» menu. In the resulting dialog you can enter two package names, which by default are set to those of Eye-D and Google Lookout. The package name of an app can be found by looking for the ID part in the URL of its corresponding Google Play web page. For example, to launch the Eye-D object recognition app from The vOICe, you enter its package name «in.gingermind.eyed» (or «in.gingermind.eyedpro» for Eye-D Pro). Package names of some other apps:
Supersense (object recognition): «com.mediate.supersense»
BlindTool (object recognition): «the.blindtool»
Snapper (image search): «com.snapper»
ZXing barcode scanner: «com.google.zxing.client.android»
ABBYY TextGrabber: «com.abbyy.mobile.textgrabber.full»
Text Fairy (OCR): «com.renard.ocr»
Speak! (OCR): «com.toucan.speak»
CamFind: «com.msearcher.camfind»
TapTapSee: «com.msearcher.taptapsee.android»
AI Sight: «com.vikrambajaj.aisight»
Aipoly: «com.aipoly.vision»
Minerva: «com.chcepe.minerva»
Chelsea Vision: «com.ChelseaAILabs.ChelseaVision»
Envision AI: «com.letsenvision.envisionai»
Google Lookout: «com.google.android.apps.accessibility.reveal»
Once Microsoft Seeing AI becomes available for Android you will similarly be able to add it by its package name and launch it from within The vOICe whenever you need it.
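For the technically curious, launching an app from its package name is a standard Android pattern; the sketch below (class name invented for the example) shows how a package name such as «in.gingermind.eyed» maps to a launchable app, falling back to the app’s Google Play page when it is not installed:

import android.content.Context;
import android.content.Intent;
import android.net.Uri;

// Sketch: launch an app by package name, or open its Google Play page if absent.
public final class ExternalAppLauncher {
    public static void launch(Context context, String packageName) {
        Intent intent = context.getPackageManager().getLaunchIntentForPackage(packageName);
        if (intent != null) {
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(intent);
        } else {
            Intent market = new Intent(Intent.ACTION_VIEW,
                    Uri.parse("market://details?id=" + packageName));
            market.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(market);
        }
    }
}

A call such as ExternalAppLauncher.launch(context, "in.gingermind.eyed") would then bring up Eye-D, which is essentially what entering a package name in the «External apps» dialog enables.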
Share your viewpoints
Pressing capital «P» on your keyboard or tapping the Share button on the screen shares the current camera preview image with an app of your choice, be it Twitter, WhatsApp, GMail, DropBox, Eye-D or whatever you want to use. If supported by the recipient app, the added text includes your current address location, but of course you can edit this out or modify it to your liking before submitting your preview image (and the preview image itself does not contain Exif time and location tags).
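Under the hood, sharing an image with a user-chosen app is normally done with an ACTION_SEND intent. The following is a minimal sketch, assuming a FileProvider configured in the manifest; the authority string, class and method names are illustrative and not taken from The vOICe:

import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import androidx.core.content.FileProvider;
import java.io.File;

// Sketch: share a saved preview image plus a text note with any app the user picks.
public final class ShareExample {
    public static void sharePreview(Activity activity, File imageFile, String note) {
        Uri uri = FileProvider.getUriForFile(
                activity, activity.getPackageName() + ".fileprovider", imageFile);
        Intent share = new Intent(Intent.ACTION_SEND)
                .setType("image/jpeg")
                .putExtra(Intent.EXTRA_STREAM, uri)
                .putExtra(Intent.EXTRA_TEXT, note) // e.g. the current address location
                .addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
        activity.startActivity(Intent.createChooser(share, "Share preview image"));
    }
}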
How to consult a remote sighted guide
If you want a sighted person to give verbal feedback on your live camera view, you can install Skype and after making a Skype connection to this sighted person you can activate Skype’s screen sharing option before starting The vOICe. This should let the sighted person see The vOICe main screen and let him or her talk with you about what is showing in the live camera view. Screen sharing provides a good workaround for the fact that only one app at a time can access the camera.
Support for sighted instructors
Sighted instructors who teach use of The vOICe to blind users can easily track on a second monitor what the blind user of The vOICe for Android is seeing/hearing. The simplest and most familiar way is to use a Google Chromecast device connected to the HDMI input of a television or external PC monitor, and use the Google Cast app to wirelessly screencast video and audio from the Android device. Stereo headphones can now be connected to the second monitor instead of to the Android device because audio on the Android device will be muted while screencasting. There exist numerous alternative screencasting apps for Android, but the Chromecast solution is among the easiest to set up. You can also globally livestream The vOICe for Android to your public YouTube channel using Google’s YouTube Gaming app, but note that this is not suitable for real-time instruction feedback because of a significant streaming delay (latency) of up to a minute.
Support for blind photography and third-party OCR
Some blind people like to take photographs to document situations much like other people do, or they apply it as an art form. The soundscapes of The vOICe can thus be used as a kind of auditory viewfinder to let blind photographers frame their subsequent high resolution photographs. Pressing the «p» key saves a high resolution photograph in JPEG format to the /vOICe folder (under the root of your storage device) or, on some devices, to the internal (flash) memory folder /data/data/vOICe.vOICe/files/ of your Android device, with a UNIX timestamp in the filename, e.g., «vOICe_1155742184123.jpg». Moreover, provided that you have headphones connected, you will hear a beep after taking a photograph and then have a few seconds to speak a descriptive label for your photograph, which is then included via speech recognition as plain text in the photo log file /vOICe/photolog.txt. If Exif tags are enabled (they are enabled by default in the «Other settings» menu), the label is also stored as an Exif image description tag in your photo file such that it can never be lost. Furthermore, the label is copied to the system clipboard for optional use in other apps. The skin color filter toggled by key «s» may be used to help frame a face in portrait photography, while the tactile graphics mode provides an alternative means for checking the view composition. More information on blind photography can be found at the Blind with Camera School of Photography.
Stored photographs show up in the standard Android Gallery under «vOICe». By default, Exif time and location tags (geotags) are also embedded in the saved photographs, just as with Android’s built-in camera app, but you can disable Exif tagging via the «Other settings» menu. Stored photographs may also be used as input for third-party OCR engines to recognize text in images. There exist numerous commercial OCR applications for Android, including for instance Scanthing OCR and ABBYY TextGrabber + Translator.
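A minimal sketch of the saving behaviour described above (a timestamped JPEG name plus an Exif image description holding the spoken label), assuming the AndroidX ExifInterface library; the folder, class and method names are placeholders rather than The vOICe’s actual code:

import android.graphics.Bitmap;
import androidx.exifinterface.media.ExifInterface;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

// Sketch: save a JPEG with a UNIX-timestamp file name and embed the spoken label
// as the Exif image description, so the label travels with the photograph.
public final class PhotoLogExample {
    public static File savePhoto(File folder, Bitmap photo, String spokenLabel)
            throws IOException {
        File jpeg = new File(folder, "vOICe_" + System.currentTimeMillis() + ".jpg");
        try (FileOutputStream out = new FileOutputStream(jpeg)) {
            photo.compress(Bitmap.CompressFormat.JPEG, 95, out);
        }
        ExifInterface exif = new ExifInterface(jpeg.getAbsolutePath());
        exif.setAttribute(ExifInterface.TAG_IMAGE_DESCRIPTION, spokenLabel);
        exif.saveAttributes();
        return jpeg;
    }
}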
The vOICe for Android has built-in support for a number of languages, such that you may be able to use The vOICe in your native language. Currently supported languages are English, German, French, Spanish, Italian, Portuguese, Dutch, Russian, Turkish, Hungarian, Slovak, Estonian, (Simplified) Chinese, and Korean. You can set the language of your preference via the Language dialog in the Options menu. The Text-to-Speech (TTS) engine that ships with the Android platform already supports a number of languages, such that The vOICe may speak your language of choice without requiring separate installations. For other languages or different voices you can buy TTS engines on Google Play.
For example, to buy and install the Russian female Katja TTS from SVOX, first install the SVOX Classic TTS engine, and next buy and install the SVOX Russian Katja Voice. Similarly, for the Dutch female Lena Voice you would instead buy and install the SVOX Dutch Lena Voice, for Chinese Mandarin female the SVOX Mandarin Yun Voice, for Chinese Cantonese female the SVOX Cantonese Hei Wan Voice, and for Korean female the SVOX Korean Sang-Mi Voice. Next, go to the phone menu Settings | Voice input & output | Text-to-speech settings, check the SVOX Classic TTS checkbox, and via the SVOX Classic TTS entry underneath install the Russian Katja voice (or the Dutch Lena, Chinese Yun or Hei Wan, or Korean Sang-Mi voice), because SVOX will state that it is not yet installed. Now when you run The vOICe for Android and in Menu | Language select Russian (or Dutch, Chinese, Korean), The vOICe will speak with a Russian (or Dutch, Chinese, Korean) voice. The vOICe for Android automatically switches among the available SVOX Pico TTS and Classic TTS engines to try to find one that matches the language selected in The vOICe.
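How selecting a speech language generally works on Android can be sketched as follows; whether the requested Russian voice is actually available still depends on the installed engine (SVOX, Google TTS, eSpeak, and so on), and the class here is purely illustrative:

import android.content.Context;
import android.speech.tts.TextToSpeech;
import java.util.Locale;

// Sketch: ask the active TTS engine for a Russian voice, falling back to the
// device default language when Russian is not available.
public final class TtsLanguageExample implements TextToSpeech.OnInitListener {
    private final TextToSpeech tts;

    public TtsLanguageExample(Context context) {
        tts = new TextToSpeech(context, this);
    }

    @Override
    public void onInit(int status) {
        if (status != TextToSpeech.SUCCESS) return;
        int result = tts.setLanguage(new Locale("ru", "RU"));
        if (result == TextToSpeech.LANG_MISSING_DATA
                || result == TextToSpeech.LANG_NOT_SUPPORTED) {
            tts.setLanguage(Locale.getDefault()); // fall back to the device default
        }
        // The engine is now ready; tts.speak(...) will use the chosen voice.
    }
}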
Alternatively, you can install the eSpeak for Android voice from Google Play. Its speech quality is not as good, but it is free and supports many languages. For the eSpeak alternative you should, after installation, enable it in the phone menu Settings | Voice input & output | Text-to-speech settings, and optionally set it as your Default Engine. Beware that at the time of this writing, one has to manually select the speech language in the eSpeak preferences, because it does not let The vOICe switch the speech language dynamically. Currently, eSpeak also fails to speak non-Latin characters such as Chinese, saying only «symbol, symbol, symbol», and so on.
Beware: to let The vOICe change voices in accordance with your language selections, you must leave the «Always use my settings» checkbox unchecked in the phone menu Settings | Voice input & output | Text-to-speech settings.
The «Single language» checkbox found in the «Other settings» menu, when unchecked, lets The vOICe switch languages while speaking the current location, using the country’s native language for street names and the user-selected language for everything else. By default the checkbox is checked, meaning that no language switching is used while speaking locations. Note also that switching languages on-the-fly via the «Language» entry in the main menu is only supported on Android 2.2 (Froyo) and later, while it currently fails with eSpeak. With earlier Android versions, The vOICe will attempt to follow the language as set by the user in the general phone Settings, and The vOICe’s «Language» menu entry and «Single language» checkbox are not available.
Credit: the app translation contributions of Soumaya (French), AccessAna (Spanish), Domenico Milano (Italian), Tiago Henriques (Portuguese), Radion Mynayev (Russian), Yusuf Sarıgöz (Turkish), Tamás Géczy (Hungarian), Miroslav Miškuf (Slovak), Artur Räpp (Estonian), Bin Xia (Chinese) and Fadoua Mayara (Arabic) were greatly appreciated. In addition, we are grateful for translations of the description in Google Play by Melanie Forster (German), Soumaya (French), AccessAna (Spanish), Angelo Gargantini (Italian), Tiago Henriques (Portuguese), Sergey Ershov (Russian), Yusuf Sarıgöz (Turkish), Tamás Géczy (Hungarian), Miroslav Miškuf (Slovak), Joanna Zieniuk (Polish), Bin Xia (Chinese) and Fadoua Mayara (Arabic). Any remaining errors are likely due to supplementary use of automatic translation tools and should not be attributed to the translators.
It is intended to also let The vOICe for Android run inside your Google TV set, sounding either the visuals of your currently selected television channel or the live view of a built-in camera if present. It is not yet known if Google will let Android applications grab and process video frames in Google TV sets, as would be required for using The vOICe.
Now what if it is raining?
In order to be able to use The vOICe for Android in adverse weather conditions such as heavy rain, you may consider using a waterproof MP3 player bag (eBay shop link).
Don’t get mugged
In all uses, please stay aware that pointing the phone’s camera at people who do not know you or The vOICe, in public places or elsewhere, might trigger hostile reactions, for instance because people may think that you are taking their photograph without their permission or otherwise invading their privacy. Similar issues may apply when pointing the camera at certain properties.
Stereo vision and 3D video? Not yet.
The vOICe for Android does not yet support stereo vision based 3D vision and depth mapping for obstacle detection and avoidance like The vOICe for Windows does on the PC, but the option will be considered once stereo vision becomes widely available on the market. Alternatively, The vOICe for Android may in the future make use of time-of-flight (TOF) image sensors and/or real-time monocular SLAM (simultaneous localization and mapping) for depth mapping.
The vOICe for iPhone, iOS?
The vOICe Web App is a platform-independent version of The vOICe that runs on the iPhone in Safari on iOS 11 and later.
The «autostart» option in the «Other settings» menu makes The vOICe launch automatically after a reboot. On known smartglasses this option defaults to ON, and on other devices it defaults to OFF. The autostart option is intended for blind users of smart glasses that are used mainly for The vOICe and that are otherwise difficult to operate due to the lack of a screen reader or text-to-speech support, or because of inconvenient controls.
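The standard Android mechanism behind such an autostart option is a boot-completed broadcast receiver; the sketch below is illustrative only (The vOICe’s own implementation may differ) and assumes the RECEIVE_BOOT_COMPLETED permission plus a receiver entry in the manifest:

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

// Sketch: start the app's main activity after a reboot.
// Note: newer Android versions restrict starting activities from the background.
public class AutostartReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_BOOT_COMPLETED.equals(intent.getAction())) {
            Intent launch = context.getPackageManager()
                    .getLaunchIntentForPackage(context.getPackageName());
            if (launch != null) {
                launch.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                context.startActivity(launch);
            }
        }
    }
}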
If you need to disable the touch screen while running The vOICe, you can make use of the Argotronic Touch Blocker / Disable Touch app, configuring it, for instance, to unlock with the volume key. While running The vOICe you can do a gentle sweep-down at the top of the screen to access the notification bar and activate Touch Blocker.
Permissions and their purposes:
Camera: live visual input from the camera
Wake lock: keep the app running hands-free
Access fine/coarse location: talking locator
Internet: talking locator, Google Analytics, ads
Access network state: Google Analytics
Access WiFi state: display WiFi info on the main screen
Write external storage: save images, soundscapes
Modify audio settings: keep the lowest audio volume above zero
Vibrate: vibration, tactile graphics display
Record audio: dialog-less voice command input
Run at startup: autostart option for smartglasses
We use device identifiers to personalise content and ads, to provide social media features and to analyse our traffic. We also share such identifiers and other information from your device with our social media, advertising and analytics partners.
See also the privacy policy, disclaimers and terms of service (TOS).
There is a free screen reader from Google for Android, named TalkBack, available from Google Play if not already pre-installed on your device. Once installed, TalkBack can be enabled via the Accessibility section of the Settings menu. The vOICe automatically reduces its self-voicing when an active screen reader is detected, to prevent menu and dialog elements from being spoken twice. Most Android phones are now fully accessible to blind users without a need to buy third-party software, but for less tech-savvy phone users there is also a non-free screen reader as part of Code Factory’s Mobile Accessibility tool suite for Android. The Project RAY and Georgie apps aim to offer simplified smartphone accessibility for pre-defined functionality, at a price. One can leave Georgie using the Back button after going to Settings, User Level, and selecting Expert Mode. Note that simplified but more restricted access to a small subset of Android apps may prevent or complicate blind access to The vOICe for Android and many other fully accessible Android apps, and may even stigmatize the blind user as being incapable of using a regular smartphone with a normal visual appearance and built-in general accessibility.
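Detecting an active screen reader such as TalkBack is typically done through the AccessibilityManager; a minimal sketch, not necessarily how The vOICe itself does it:

import android.content.Context;
import android.view.accessibility.AccessibilityManager;

// Sketch: check whether a screen reader with touch exploration is active,
// so that an app can reduce its own self-voicing and avoid double speech.
public final class ScreenReaderCheck {
    public static boolean isScreenReaderActive(Context context) {
        AccessibilityManager am = (AccessibilityManager)
                context.getSystemService(Context.ACCESSIBILITY_SERVICE);
        return am != null && am.isEnabled() && am.isTouchExplorationEnabled();
    }
}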
For alternative speech engines, you can consider installing voices from SVOX.
The vOICe for Android is prepared to take advantage of multicore CPUs for improved performance through parallel processing. The number of active CPU cores is indicated by the number of pips on a die face symbol at the top of the screen: a single-core processor shows as a small square containing a single dot ⚀ , whereas a dual-core processor shows a square with 2 dots ⚁ and a quad-core processor shows a square with 4 dots ⚃ . Beyond 6 cores a number indication is used instead of a die face. The CPU load only indicates the percentage of time spent in image processing and soundscape synthesis, and does not include video capture and screen rendering.
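The number of available CPU cores can be queried with a single standard call; the die-face mapping below is a sketch of how the indicator described above could be chosen, not the app’s actual code:

// Sketch: query the core count and map it to a die-face symbol (or a plain
// number beyond 6 cores), as in the status indicator described above.
public final class CoreCount {
    public static int availableCores() {
        return Runtime.getRuntime().availableProcessors();
    }

    public static String dieFaceFor(int cores) {
        switch (cores) {
            case 1: return "\u2680"; // ⚀
            case 2: return "\u2681"; // ⚁
            case 3: return "\u2682"; // ⚂
            case 4: return "\u2683"; // ⚃
            case 5: return "\u2684"; // ⚄
            case 6: return "\u2685"; // ⚅
            default: return Integer.toString(cores); // beyond 6 cores: plain number
        }
    }
}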
The vOICe for Android requests a number of permissions upon installation. The table above lists what these permissions are used for. Note that Internet access is used only for Google based services such as the Google Maps based talking locator, Google Analytics for anonymous user statistics (The vOICe itself does not track any user behavior), and ad delivery by Google AdMob (currently no ads are shown on smart glasses or when a screen reader is running). No audio is recorded by The vOICe itself, but the «record audio» permission is nevertheless required for dialog-less Google speech recognition as used for voice command input.
Google is working on Google Goggles, targeting the possibility of using a camera phone or equivalent to recognize objects and texts from the environment in order to search for information (Google Visual Search). This may in due course complement The vOICe for Android, for instance by directly linking Goggles’ OCR part to Google’s TalkBack screen reader for speaking any text in the camera view. More on this in the April 2010 techradar article Google: we plan to open up our Goggles platform (Shailesh Nalawadi). Another app for launching Google Goggles and speaking recognition results is Talking Goggles by sparklingapps.
Other Android-based augmented reality (AR) applications include Wikitude from mobilizy.com and Layar from SPRXMobile. Related Android freeware includes Tactile Mobile by iFeelPixel Association, the program «Blind» by José Villena, the Bluetooth-based Talking-Points talking beacons project for city navigation by Peter Kretschman et al. of the University of Michigan, and the Barcode Scanner by the ZXing Team. For more on Android accessibility, as needed also for blind navigation of The vOICe for Android menus, read the Open Letter Initiative of Per Busch.
On phones that lack a physical keyboard, or when the physical keyboard is hidden, a keyboard icon at the bottom right of the landscape screen view of The vOICe lets you pop up the soft keyboard (virtual keyboard). A known Android issue is that the soft keyboard may fail after launching a third-party activity from The vOICe (such as for barcode reading or an ad). The workaround is then to visit the Options menu with the menu key, after which the icon-activated soft keyboard will work again. To access the Options menu you can also press the bottom middle of the screen. This provides an alternative if you have neither a physical menu key nor an on-screen Options menu button (which happens if The vOICe is running in full screen mode, also known as immersive mode, a setting in its «Other settings» menu).
Using the TalkBack soft keyboard: Google’s TalkBack screen reader contains an accessible soft keyboard for blind users, or if it is not included you can get it separately as the Eyes-Free keyboard. This keyboard can be used instead of The vOICe’s default soft keyboard, but unlike other keyboards it currently does not respond to tapping the bottom right corner of the main screen. Assuming that the Eyes-Free keyboard was in hidden mode, you can, after launching The vOICe, pop it up in typing mode through a long-press on the volume-down button. Then you can apply any of the key-press commands of The vOICe, such as lowercase ‘x’ to toggle the look-around locator mode. In case the Eyes-Free keyboard starts in uppercase (a quirk that has been observed on some devices), you can toggle it to lowercase by pressing the Shift key twice. To hide the Eyes-Free keyboard again, long-press the volume-up key. Then you can apply any of the touch screen commands of The vOICe, such as double-tapping the middle of the screen to toggle mute. You can also press the phone’s power button to turn off both the screen and the camera.
The Eyes-Free keyboard is our recommended accessible soft keyboard for blind use of The vOICe.
The vOICe for Android does not yet run on Android ports to Raspberry Pi 3 (issue tracker), such as Android Things or RTAndroid because of incompatibilities in the camera HAL (hardware abstraction layer).
The ZGPAX S5 smartwatch phone firmware wrongly pretends to have a back-facing camera whereas the device physically has a front-facing camera. For blind users this gives a wrong right-to-left soundscape scanning direction when pointing the camera away from the user, and they should set the RTL (right-to-left) soundscape scanning option in the «Other settings» menu to compensate for this bug. Beware also that the ZGPAX S5 can reportedly run very hot.
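For context, this is how an app typically asks Android whether a camera faces front or back using the legacy Camera API; when the firmware reports the wrong facing, as on the ZGPAX S5, this query is misled, which is why the manual RTL option exists. The class name is illustrative:

import android.hardware.Camera;

// Sketch: query the reported facing of a camera with the legacy Camera API.
@SuppressWarnings("deprecation")
public final class CameraFacingCheck {
    public static boolean isFrontFacing(int cameraId) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);
        return info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT;
    }
}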
To help protect the ZGPAX S5, Google Glass and other devices against overheating, The vOICe for Android automatically quits when the battery temperature gets above 65 degrees Celsius (149F) or the CPU temperature gets above 85 degrees Celsius (185F), provided that the device supports the underlying temperature sensing. Google Glass may itself also issue «Glass must cool down to run smoothly» messages if The vOICe’s protection does not kick in before that. Indeed Google Glass becomes very slow as it runs hot, by throttling its CPU clock frequency down in order to lower power dissipation.
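Reading the battery temperature is possible through Android’s sticky battery broadcast; the sketch below shows the general approach (comparing against the 65 degrees Celsius threshold is left to the caller, and the class name is invented for the example):

import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.os.BatteryManager;

// Sketch: read the battery temperature. EXTRA_TEMPERATURE is reported in tenths
// of a degree Celsius, so a value of 650 means 65.0 °C.
public final class BatteryTemperature {
    public static float readCelsius(Context context) {
        Intent battery = context.registerReceiver(null,
                new IntentFilter(Intent.ACTION_BATTERY_CHANGED));
        if (battery == null) return Float.NaN;
        int tenths = battery.getIntExtra(BatteryManager.EXTRA_TEMPERATURE, -1);
        return tenths < 0 ? Float.NaN : tenths / 10f;
    }
}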
A number of HTC phones are affected by a bug where the soft keyboard can get stuck in uppercase mode, and pressing the Shift key does not help to toggle between uppercase and lowercase. This means that many of the lowercase keyboard shortcuts of The vOICe cannot be used. This is not a bug in The vOICe but a bug in the HTC Sense interface, and a workaround is to install and select a third-party keyboard app such as the Hacker’s Keyboard (which includes left/right/up/down arrow keys), SwiftKey 3 Keyboard or GO Keyboard. These keyboards do not get stuck in uppercase mode. For some of the keyboards you need to go into their settings to enable the display of arrow keys. Also note that these soft keyboards need not be accessible to blind users. Some keyboards can also be quite slow to appear and disappear, and this too is not caused by The vOICe. One of the quickest keyboards to appear and disappear under The vOICe is the Google Keyboard.
Switching among different speech engines on-the-fly from within The vOICe for Android appears erratic under Android 4.x, and methods such as setEngineByPackageName() that allowed for quickly switching speech engines under Android 2.x and that worked quite well have unfortunately been deprecated by Google. In general, TTS support under Android 4.x appears less mature than it was under Android 2.3.3. If you want to mostly work with a particular speech engine, this is best set in the general phone settings (i.e., outside The vOICe).
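Since API level 14 a specific engine can instead be requested directly in the TextToSpeech constructor; a minimal sketch, with the Google TTS package name used only as an example:

import android.content.Context;
import android.speech.tts.TextToSpeech;

// Sketch: request a particular speech engine at construction time, the
// non-deprecated alternative to setEngineByPackageName().
public final class EngineSelectionExample {
    public static TextToSpeech withEngine(Context context,
                                          TextToSpeech.OnInitListener listener) {
        return new TextToSpeech(context, listener, "com.google.android.tts");
    }
}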
License restrictions: You may freely and without cost apply The vOICe for Android for personal or academic use. You will not charge a fee or request donations for The vOICe for Android, nor distribute or include The vOICe for Android in or with commercial products, nor modify or reverse engineer The vOICe for Android. You will not use The vOICe for Android for commercial purposes.
Disclaimer: THIS PRODUCT (The vOICe for Android) IS PROVIDED AS IS WITHOUT ANY WARRANTY OF ANY KIND. TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, PETER B.L. MEIJER FURTHER DISCLAIMS ALL WARRANTIES, INCLUDING WITHOUT LIMITATION ANY IMPLIED OR STATED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NONINFRINGEMENT. THE ENTIRE RISK ARISING OUT OF THE USE OR PERFORMANCE OF THIS PRODUCT AND DOCUMENTATION REMAINS WITH RECIPIENT. TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, IN NO EVENT SHALL PETER B.L. MEIJER OR HIS SUPPLIERS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, DIRECT, INDIRECT, SPECIAL, PUNITIVE, RECURSIVE, OR OTHER DAMAGES WHATSOEVER (INCLUDING, WITHOUT LIMITATION, DAMAGES FOR LOSS OF BUSINESS PROFITS, BUSINESS INTERRUPTION, LOSS OF BUSINESS INFORMATION, PERSONAL INJURY, DISRUPTION OF FAMILY LIFE, OR OTHER PECUNIARY LOSS) ARISING OUT OF THIS AGREEMENT OR THE USE OF OR INABILITY TO USE THE PRODUCT, EVEN IF PETER B.L. MEIJER HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. BECAUSE SOME STATES/JURISDICTIONS DO NOT ALLOW THE EXCLUSION OR LIMITATION OF LIABILITY FOR CONSEQUENTIAL OR INCIDENTAL DAMAGES, THE ABOVE LIMITATION MAY NOT APPLY TO THE RECIPIENT. LET IT BE KNOWN THAT THE USER OF THIS PROGRAM HAS PAID PETER B.L. MEIJER THE SUM TOTAL OF $0.00 FOR THE USE OF THIS PROGRAM. ANY DAMAGES AWARDED SHALL NOT BE IN EXCESS OF SAID AMOUNT.
Supplemental information associated with disclaimer: