Android: get the Intent when implementing RecognitionListener

As stated in this answer by Iftah, I can get the audio recorded by Speech Recognition on Android by reading the Uri from the Intent passed to:

@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    // the recording Uri is in getData()
    Uri audioUri = data.getData();
}

Here the Intent data contains exactly what I want, no problems.
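For reference, here is a minimal sketch of how that returned Uri could be consumed: copying the AMR audio into the app's private storage via ContentResolver. The helper name, file name and error handling are illustrative additions, not part of the original post; the java.io imports are assumed.

    // Hypothetical helper (not from the original post): copy the recognizer's audio
    // into the app's private storage so it can be played back or uploaded later.
    private void saveRecognizedAudio(Uri audioUri) {
        if (audioUri == null) return;
        try (InputStream in = getContentResolver().openInputStream(audioUri);
             OutputStream out = openFileOutput("recognized_audio.amr", MODE_PRIVATE)) {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } catch (IOException e) {
            Log.e("Debug", "Could not copy recognizer audio", e);
        }
    }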

That all works perfectly; however, this solution shows a prompt asking the user to speak, which I did not want. To work around it, I let my activity implement RecognitionListener, like so:

    public class MainActivity extends AppCompatActivity implements RecognitionListener {

        private SpeechRecognizer speech = null;
        private Intent recognizerIntent;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);
            recognizerIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
            recognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "sv_SE");
            recognizerIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE, "sv_SE");
            recognizerIntent.putExtra("android.speech.extra.GET_AUDIO_FORMAT", "audio/AMR");
            recognizerIntent.putExtra("android.speech.extra.GET_AUDIO", true);
            speech = SpeechRecognizer.createSpeechRecognizer(this);
            speech.setRecognitionListener(this);
            speech.startListening(recognizerIntent);
        }

        @Override
        public void onReadyForSpeech(Bundle params) { Log.d("Debug", "On ready for speech"); }

        @Override
        public void onBeginningOfSpeech() { Log.d("Debug", "On beginning of speech"); }

        @Override
        public void onRmsChanged(float rmsdB) { Log.d("Debug", "Rms changed"); }

        @Override
        public void onBufferReceived(byte[] buffer) { Log.d("Debug", "Buffer received"); }

        @Override
        public void onEndOfSpeech() { Log.d("Debug", "On end of speech"); }

        @Override
        public void onError(int error) { Log.d("Debug", "Error"); }

        @Override
        public void onPartialResults(Bundle partialResults) { Log.d("Debug", "On partial result"); }

        @Override
        public void onEvent(int eventType, Bundle params) { Log.d("Debug", "On event"); }

        @Override
        public void onResults(Bundle results) { Log.d("Debug", "On result"); }
    }

This gets rid of the prompt, but I can't figure out how to get the Uri as in the first example, because here:

    @Override
    public void onResults(Bundle results) {
        Log.d("Debug", "On result");
        // the results bundle doesn't contain the URI!
    }

I receive a results Bundle that contains neither the Intent nor the Uri. I tried looking through all the keys in the bundle and there is no URI or Intent; I also tried getIntent(), but it doesn't return anything useful.
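For context, a minimal sketch of what inspecting that callback bundle looks like, assuming the standard SpeechRecognizer extras. It only illustrates that the bundle carries the recognized text (and, on many devices, confidence scores), not the recorded audio:

    @Override
    public void onResults(Bundle results) {
        // Dump every key the recognizer actually delivered.
        for (String key : results.keySet()) {
            Log.d("Debug", "results key: " + key);
        }

        // The standard payload is the list of recognized strings ...
        ArrayList<String> matches =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        // ... plus, optionally, per-match confidence scores.
        float[] scores = results.getFloatArray(SpeechRecognizer.CONFIDENCE_SCORES);

        if (matches != null && !matches.isEmpty()) {
            Log.d("Debug", "best match: " + matches.get(0));
        }
    }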

I appreciate any help or a push in the right direction.

2 Solutions collected from the web for "Android: get the Intent when implementing RecognitionListener"

Here is a similar post. They use startActivityForResult() instead of startListening() to get the data. They also note that the resulting audio file is of low quality.
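A rough sketch of what that suggestion looks like in practice: launching the recognizer via startActivityForResult() with the undocumented GET_AUDIO extras, then reading the audio Uri back in onActivityResult(). The request code is arbitrary and the language extras simply mirror the question's setup:

    private static final int REQUEST_SPEECH = 1234; // arbitrary request code

    private void startSpeechForAudio() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "sv_SE");
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE, "sv_SE");
        // Undocumented extras that ask the recognizer to hand back the recorded audio.
        intent.putExtra("android.speech.extra.GET_AUDIO_FORMAT", "audio/AMR");
        intent.putExtra("android.speech.extra.GET_AUDIO", true);
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    public void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK && data != null) {
            Uri audioUri = data.getData(); // the recorded (low quality) audio
            Log.d("Debug", "audio uri: " + audioUri);
        }
    }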

Hope this helps.

I don't quite understand what you are trying to achieve. I found that to get the audio amplitude I had to use the Visualizer class: get the audio stream in the MainActivity, send it to a View class, and pass the value through a static global variable that is read by a Runnable on the main thread. If you want to do anything with the audio stream, take a look at the Visualizer class (see the sketch below). Clarify your question and I can definitely help you. Cheers.
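As an illustration of that suggestion, a minimal sketch of hooking android.media.audiofx.Visualizer onto the global audio output (session 0) to receive waveform data and derive a crude amplitude value. It assumes the RECORD_AUDIO permission (and MODIFY_AUDIO_SETTINGS on some versions) has already been granted, and it is only a sketch of the technique, not the answerer's exact code:

    // Capture waveform data from the global output mix (audio session 0).
    Visualizer visualizer = new Visualizer(0);
    visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]); // largest capture size
    visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
        @Override
        public void onWaveFormDataCapture(Visualizer v, byte[] waveform, int samplingRate) {
            // waveform holds unsigned 8-bit PCM samples; compute a rough peak amplitude.
            int peak = 0;
            for (byte b : waveform) {
                int sample = (b & 0xFF) - 128;
                peak = Math.max(peak, Math.abs(sample));
            }
            Log.d("Debug", "peak amplitude: " + peak);
        }

        @Override
        public void onFftDataCapture(Visualizer v, byte[] fft, int samplingRate) {
            // not used in this sketch
        }
    }, Visualizer.getMaxCaptureRate() / 2, true, false);
    visualizer.setEnabled(true);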
