
Errata for Android Accessibility by Tutorials, 1st ed

Creating this topic to catch any typos or bugs in the 1st Edition of Android Accessibility by Tutorials.

Hi!

I think this is the correct place to post this (I hope it is).

In Chapter 5, Section 1, when attaching the OnPageChangeCallback, the following paragraph says:

When it does change, it sets the Next button text if the user is on the last page, and it sets the Done button visibility depending on if it’s the first page. Lastly, when the user is on the first page, it hides the Back button.

But that doesn’t seem right, right? The Next button text changes on the last page; that’s OK. The Back button visibility changes on the first page; that’s OK too. But what does “sets the Done button visibility depending on if it’s the first page” mean? Isn’t the Done button (which is the same as the Next button) always visible?

Am I missing something? Cheers.


Yes, this is the place! Thank you :]

You’re right that the wording is a little confusing. That button is always visible; we just change its text between “Done” and “Next” depending on which page it’s on.


I read this book, and as a blind developer of Android apps, I have some ideas and notes.

The problem of clicking custom links

In the chapter about links, you didn’t write anything about actually clicking on links, but in my opinion this part is very important. For example (it’s a real example): a TextView has links with the custom scheme exec. Sighted people click them via the onTouch method of the LinkMovementMethod class. But how can blind people click such a link? The link appears in TalkBack’s links menu, but after clicking on it nothing happens. Note: this is about links in regular views; in a WebView, the framework makes links accessible by itself.

A solution to this problem

In the manifest, you should declare this in the activity that will open these links:

<intent-filter>
    <action android:name="android.intent.action.VIEW"/>
    <data android:scheme="exec"/>
    <data android:scheme="Exec"/>
    <data android:scheme="EXEC"/>
    <category android:name="android.intent.category.DEFAULT"/>
</intent-filter>

If the view with the links lives in the same activity that will handle your custom links, you should also add android:launchMode="singleTask" (or any other value that allows the activity’s onNewIntent method to be called).

After that, you should override onNewIntent and/or read the intent in onCreate (for the case where your activity is started for the first time with ACTION_VIEW, or where onNewIntent is never called in your activity). You can get the link using intent.getData(). Note: your activity must have android:exported="true" in the manifest, otherwise TalkBack, at least the latest version, will crash. That means blind users have to reboot their devices, or ask sighted people to press OK in the system dialog announcing that TalkBack crashed, in order to restart TalkBack. Google, please fix this issue in your screen reader, because it’s not good when a screen reader crashes.
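As a minimal sketch of the handling side (only the framework APIs are real; the class name and the handleExecLink() helper are illustrative), the activity could read the exec: URI like this:

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;

// Hypothetical activity that receives the custom "exec" scheme declared
// in the intent-filter above.
public class LinksActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Covers the case where the activity is launched for the first
        // time with ACTION_VIEW, so onNewIntent() is never called.
        handleExecLink(getIntent());
    }

    @Override
    protected void onNewIntent(Intent intent) {
        super.onNewIntent(intent);
        // Called instead of creating a new instance, thanks to
        // android:launchMode="singleTask".
        handleExecLink(intent);
    }

    private void handleExecLink(Intent intent) {
        if (intent == null || !Intent.ACTION_VIEW.equals(intent.getAction())) {
            return;
        }
        Uri uri = intent.getData(); // e.g. exec://something
        if (uri != null && "exec".equalsIgnoreCase(uri.getScheme())) {
            // React to the clicked link here.
        }
    }
}
```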

Another solution to this problem

Not all screen readers recognize links in plain text; for example, the Jieshuo screen reader apparently can’t do this. Also, it’s not that quick to open TalkBack’s menu, find the links item and then find the needed link. In my opinion, it’s faster to find links using swipes. For this, we should add virtual views.

In your book you mentioned ExploreByTouchHelper, but we can also use the AccessibilityNodeProvider class together with the dispatchHoverEvent method of our custom view, so that our links can be focused with explore-by-touch as well as with swipes. Each virtual view (AccessibilityNodeInfo) should have at least the ACTION_ACCESSIBILITY_FOCUS and/or ACTION_FOCUS action, plus the ACTION_CLICK action. In my case, on my old device with Android 4.4.2 (API 19), ACTION_CLICK is not delivered, but on my new device with Android 11 everything works. It’s probably an SDK issue, but maybe not; I’m not sure.

For each virtual view we should also set correct coordinates (myAccessibilityNodeInfo.setBoundsInScreen(someRect)). If the coordinates are incorrect, TalkBack (and probably other screen readers) will ignore the view, just as it does for info.setImportantForAccessibility(false) or for a node that is invisible to the user. I suppose the ACTION_CLICK failure on API 19 might be caused by a coordinate problem in my code, because I don’t know how to get the coordinates of each link. You can skip the setBoundsInParent method without getting the crash you described when writing about the ExploreByTouchHelper class. You should also set the parent and source for your virtual view: info.setParent(myView) and info.setSource(myView, virtualViewId) (if you use the AccessibilityNodeInfo.obtain method, you don’t have to call setSource, because it works even without it). I very much hope that you, as the author of this book, will add everything I’m writing now to your next edition, and will find the answer to the question about coordinates for each link.
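A rough sketch of the node-creation side, assuming the host is a hypothetical custom TextView subclass called LinkTextView whose text contains ClickableSpans (the helpers textOfLink() and boundsOfLink() are placeholders; the framework calls are real):

```java
// Inside LinkTextView: expose each ClickableSpan as a virtual node.
@Override
public AccessibilityNodeProvider getAccessibilityNodeProvider() {
    return new AccessibilityNodeProvider() {
        @Override
        public AccessibilityNodeInfo createAccessibilityNodeInfo(int virtualViewId) {
            if (virtualViewId == AccessibilityNodeProvider.HOST_VIEW_ID) {
                AccessibilityNodeInfo host = AccessibilityNodeInfo.obtain(LinkTextView.this);
                onInitializeAccessibilityNodeInfo(host);
                for (int i = 0; i < getLinks().length; i++) {
                    host.addChild(LinkTextView.this, i); // one child per link
                }
                return host;
            }
            ClickableSpan link = getLinks()[virtualViewId];
            AccessibilityNodeInfo node =
                    AccessibilityNodeInfo.obtain(LinkTextView.this, virtualViewId);
            node.setParent(LinkTextView.this);
            node.setContentDescription(textOfLink(link)); // hypothetical helper
            node.setVisibleToUser(true);
            node.addAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS);
            node.addAction(AccessibilityNodeInfo.ACTION_CLICK);
            // One way to compute screen coordinates is from getLayout():
            // getSpanStart(), getLineForOffset(), getLineBounds() and
            // getPrimaryHorizontal(), then offset by getLocationOnScreen().
            node.setBoundsInScreen(boundsOfLink(link)); // hypothetical helper
            return node;
        }

        @Override
        public boolean performAction(int virtualViewId, int action, Bundle args) {
            // A real implementation must handle ACTION_ACCESSIBILITY_FOCUS
            // and ACTION_CLICK here; returning false rejects the action.
            return false;
        }
    };
}

private ClickableSpan[] getLinks() {
    Spanned text = (Spanned) getText();
    return text.getSpans(0, text.length(), ClickableSpan.class);
}
```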

AccessibilityActions

In your book you wrote about accessibility actions, but in my opinion you didn’t pay enough attention to this topic. Thanks to actions, a screen reader can know whether a view is collapsed or expanded. We can also work with the granularity of the view. Note: it’s strange, but if a virtual view created with AccessibilityNodeProvider has text but no contentDescription, TalkBack doesn’t move by characters, words, lines, paragraphs, etc. Probably additional actions should be added to the node. You also didn’t say anything about custom actions, which users can trigger by finding the actions item in TalkBack’s menu, or by executing a gesture assigned to that command. But these actions will not be shown, and this menu item will not appear, if the id of the action is <= 0x01FFFFFF (see the isCustomAction method in the AccessibilityNodeInfoUtils class in the TalkBack sources on GitHub). So, ids of custom AccessibilityActions should start from 0x01FFFFFF + 1.
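One way to register such a custom action without managing the ids by hand is the AndroidX helper, which generates a valid id for you. A sketch (itemView and archiveItem() are placeholders; ViewCompat.addAccessibilityAction is a real androidx.core API):

```java
import androidx.core.view.ViewCompat;

// Registers a custom action that appears in TalkBack's actions menu.
int actionId = ViewCompat.addAccessibilityAction(
        itemView,
        "Archive", // label announced in the actions menu; sample text
        (view, arguments) -> {
            archiveItem(); // hypothetical app callback
            return true;   // report that the action was handled
        });

// Later, the action can be unregistered again:
// ViewCompat.removeAccessibilityAction(itemView, actionId);
```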

There is an issue in ExpandableListView where TalkBack doesn’t announce whether a list item is collapsed or expanded. Yes, a custom view in our adapter partially solves this problem, but the performClick method of our custom view is not called when clicking on the list item, so TalkBack (and probably other screen readers) doesn’t announce the collapsed/expanded state at click time; using swipes and/or explore-by-touch, however, we can hear it.

Scrolling problems

Some apps, e.g. the official Telegram for Android (whose source you can find on GitHub), have a problem. When we press the jump-to-end button (or something similar) to move to the last message and then swipe left, automatic scrolling of the list doesn’t work in TalkBack, so we have to use the two-finger up/down gesture to scroll the list backward/forward. If we make this swipe very long, or possibly very quickly (I’m not sure), we can skip some messages. The same scroll problem exists in Viber, though not in the move-to-last-message case.

The second scroll problem appears when a list has an item (a message) with a lot of text, e.g. 2000 characters in a huge font. In that case not only does the list scroll, but the view with this large text has its own scroll. We then have to swipe very quickly, because otherwise we only hear the scrolling sound from TalkBack (or probably any other screen reader), or, as in Viber, we move around the views inside the item and can’t move to the next item without a two-finger up/down swipe. By “the list” here I mean a RecyclerView, at least in the Telegram app. There is also a problem when a TextView has a lot of text and we read it by characters, words, lines, paragraphs, etc.: after the text scrolls during this movement, focus is cleared from the view, so we can’t continue moving between pieces of the text.

roleDescription, stateDescription, headings and other notes

When you wrote in your book about roleDescription, you didn’t mention that besides roleDescription, we can also override the getAccessibilityClassName method in a custom view, or set the class name on the AccessibilityNodeInfo directly. In that case TalkBack (and probably other screen readers) will announce the type of the element, for example as a button if the class name is android.widget.Button. Also, Android 11 added a new attribute with matching methods: stateDescription. I hope you will write about it in more detail in your next edition, as well as about paneTitle and tooltipText, for example.
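A minimal sketch of both techniques together, assuming a hypothetical custom toggle view (getAccessibilityClassName is a real View method since API 23, and ViewCompat.setStateDescription backports the Android 11 attribute via androidx.core):

```java
import android.content.Context;
import android.view.View;
import androidx.core.view.ViewCompat;

// Illustrative custom view that announces as a Switch and exposes
// its on/off state through stateDescription.
public class CustomToggleView extends View {

    private boolean checked;

    public CustomToggleView(Context context) {
        super(context);
    }

    @Override
    public CharSequence getAccessibilityClassName() {
        // TalkBack derives the announced role from this class name.
        return android.widget.Switch.class.getName();
    }

    public void setChecked(boolean checked) {
        this.checked = checked;
        // Backported to older API levels by androidx.core.
        ViewCompat.setStateDescription(this, checked ? "On" : "Off");
    }
}
```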

In your book you wrote that accessibilityHeading works from Android API 28. That’s not quite right, because AccessibilityNodeInfoCompat allows setting it from API 19. The CollectionInfo and CollectionItemInfo classes also allow this, and I hope you will write about these two classes in the next edition, because thanks to them we can mark a node as a list item, or as a row and column of a table, etc. I also hope you will write about the RangeInfo class and about the accessibility actions that allow changing the value of a RangeInfo. In my opinion this is especially relevant for custom seekbars, because, for example, WhatsApp and some other apps have seekbars that are impossible to move with a screen reader using the volume keys or other screen reader features for seekbars. So we have to drag a finger over them to change their value, which is not always easy.
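A sketch of marking a view as a heading and as a table cell with the compat classes, so it also works before API 28 (the APIs are from androidx.core; headerView is a placeholder):

```java
import android.view.View;
import androidx.core.view.AccessibilityDelegateCompat;
import androidx.core.view.ViewCompat;
import androidx.core.view.accessibility.AccessibilityNodeInfoCompat;

ViewCompat.setAccessibilityDelegate(headerView, new AccessibilityDelegateCompat() {
    @Override
    public void onInitializeAccessibilityNodeInfo(View host,
                                                  AccessibilityNodeInfoCompat info) {
        super.onInitializeAccessibilityNodeInfo(host, info);
        // Backported heading flag (stored in extras on older API levels).
        info.setHeading(true);
        // Position inside a collection: row 0 spanning 1 row,
        // column 1 spanning 1 column, not itself a heading.
        info.setCollectionItemInfo(
                AccessibilityNodeInfoCompat.CollectionItemInfoCompat.obtain(
                        0, 1, 1, 1, false));
    }
});
```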

There is also a TTS problem that you didn’t describe in your book. I mean: when, for example, we have a list, click on one of its items so that the list changes, and at that moment we speak some message via the TextToSpeech class, the message is interrupted by TalkBack, because TalkBack announces the changes in the list. I’m not sure, but it seems the same problem exists when we use the announceForAccessibility method of the View class. I haven’t found a way to solve this. Yes, we can sleep before speaking, but we can’t guess the exact delay, and a large delay, in my opinion, is not good. It also seems that your book doesn’t pay enough attention to sending accessibility events. For example, instead of accessibilityLiveRegion, we can use this code to speak information:

public void speak(Context context, String text) {
    AccessibilityManager manager =
            (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
    // sendAccessibilityEvent throws if no accessibility service is enabled.
    if (manager == null || !manager.isEnabled()) return;
    // The public AccessibilityEvent(int) constructor only exists since API 30;
    // obtain() works on all versions.
    AccessibilityEvent event = AccessibilityEvent.obtain(AccessibilityEvent.TYPE_ANNOUNCEMENT);
    event.getText().add(text);
    manager.sendAccessibilityEvent(event);
}

Also, when we receive the ACTION_ACCESSIBILITY_FOCUS action in an AccessibilityNodeProvider (see above), we must send an accessibility event to focus the virtual view; otherwise TalkBack (and probably other screen readers) will ignore the node, just as it does for incorrect node coordinates. We should also return true from performAccessibilityAction for the ACTION_ACCESSIBILITY_FOCUS and ACTION_FOCUS actions. Using this approach, we can, in my opinion, build a traversal between child nodes and the parent node without the setTraversalBefore and setTraversalAfter methods.
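The focus handling could look roughly like this inside the provider's performAction (hostView stands for the real view hosting the virtual nodes; the event types and actions are real framework APIs):

```java
@Override
public boolean performAction(int virtualViewId, int action, Bundle arguments) {
    switch (action) {
        case AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS:
            // Tell the screen reader the virtual node now has focus;
            // without this event TalkBack ignores the node.
            AccessibilityEvent focused = AccessibilityEvent.obtain(
                    AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED);
            focused.setSource(hostView, virtualViewId);
            hostView.getParent().requestSendAccessibilityEvent(hostView, focused);
            return true;
        case AccessibilityNodeInfo.ACTION_CLEAR_ACCESSIBILITY_FOCUS:
            AccessibilityEvent cleared = AccessibilityEvent.obtain(
                    AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUS_CLEARED);
            cleared.setSource(hostView, virtualViewId);
            hostView.getParent().requestSendAccessibilityEvent(hostView, cleared);
            return true;
        case AccessibilityNodeInfo.ACTION_CLICK:
            // Invoke the ClickableSpan for this virtual view here.
            return true;
        default:
            return false;
    }
}
```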

contentDescription of the menu button

If an application has a menu, sighted people see three dots, and after pressing them the menu opens on the screen. For blind people this button has a contentDescription, which has different names on different devices. It would be very good if we could get hold of this view in order to change its contentDescription. I even reported an issue to Google, but maybe you can find a way to get this view and change its contentDescription to something more understandable.
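One possible workaround, as a sketch only: it relies on the current Toolbar layout (the overflow button usually being the last ImageView inside the ActionMenuView) and is not guaranteed to hold on every device or appcompat version.

```java
import android.view.View;
import android.widget.ImageView;
import androidx.appcompat.widget.ActionMenuView;

// After the menu has been created, search the Toolbar's ActionMenuView
// for the overflow button and relabel it.
toolbar.post(new Runnable() {
    @Override
    public void run() {
        for (int i = 0; i < toolbar.getChildCount(); i++) {
            View child = toolbar.getChildAt(i);
            if (child instanceof ActionMenuView) {
                ActionMenuView menuView = (ActionMenuView) child;
                // The overflow button is usually the last child and an ImageView.
                View last = menuView.getChildAt(menuView.getChildCount() - 1);
                if (last instanceof ImageView) {
                    last.setContentDescription("More options menu");
                }
            }
        }
    }
});
```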

Thank you very much for your book; I will look forward to your next edition.

My contacts

If you want, you can contact me by e-mail, Telegram or Skype, all with the same login: ssashablr_1994.

Good luck to you.

Sorry, my e-mail wasn’t inserted as a link (probably an issue on this site), so here it is: k.sasha1994@yandex.ru.

@sashakozlovskiy Thank you so much for those suggestions! You mentioned many important things that I’ve definitely taken note of.


@vgonda I also want to ask you to include a more direct link to this topic in the future edition.

You can also see this issue, where I mentioned the ACTION_SHOW_ON_SCREEN action, which should be handled too.