Looking at the wm8960 driver, it appears there are two places in wm8960_configure_clocking() that can return the ‘failed to configure clock’ error (which is not very helpful). Can you check exactly where it is failing? Looking more at the driver, I wonder if it is failing because wm8960->freq_in is not set correctly. It looks like this is set by wm8960_set_dai_pll(), so maybe we are missing a call to snd_soc_dai_set_pll(). Maybe check the Raspberry Pi driver to see what it calls to initialise the codec.
It seems the problem is that lrclk and bclk are zero. They are initialized in the wm8960_hw_params() function, so I also added some debug outputs there. The result is the following:
As you can see, the wm8960_hw_params() is called after the wm8960_configure_clocking() which causes the failure of the clock configuration. When I try to play the file again, the “failed to configure clock” error message disappears:
Now with regard to the other errors, I am still not certain we have the clocking configured right. I had a look at the schematics for the ‘ReSpeaker 2-Mics Pi HAT’ and it does show a 24MHz oscillator. I also had a look at another driver, sound/soc/fsl/fsl-asoc-card.c, which uses this codec. I think that what we need to do is something like …
Looking at the wm8960 driver used for the ReSpeaker, it looks like they hard-coded the 24MHz …
We should not need to do that if we configure the PLL correctly. I still do not understand who is calling wm8960_configure_clocking() when it fails the first time. Seems that it should be called by the codec’s hw_params() function or in the codec probe (which would happen early on). Do you know who is calling this when bclk and lrclk are zero? I guess you could add a ‘dump_stack()’ that would show you the call stack when it is called.
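For reference, a minimal way to do this (untested, and assuming the 4.x-era sound/soc/codecs/wm8960.c where wm8960_configure_clocking() takes the codec pointer) would be a one-line debug patch at the top of the function:

```diff
--- a/sound/soc/codecs/wm8960.c
+++ b/sound/soc/codecs/wm8960.c
@@ static int wm8960_configure_clocking(struct snd_soc_codec *codec)
 {
 	struct wm8960_priv *wm8960 = snd_soc_codec_get_drvdata(codec);
+
+	/* Debug: show who is calling us while bclk/lrclk are still zero */
+	dump_stack();
```

The backtrace printed in dmesg should show whether the first call comes from the codec probe, a DAPM/bias-level transition, or hw_params.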
The 12288000 in the device tree does not make sense to me. You may wish to check with the vendor of the board whether it really is 24MHz or not.
Looking at the wm8960 data sheet, it can accept an MCLK of up to 33MHz, so 24MHz is certainly valid.
Looking at the wm8960, I can see that the reason why configuring the PLL now fails is that the ratio between the MCLK and the output clock is not large enough. Furthermore, the wm8960 datasheet states that the PLL performs best when the output frequency is between 90MHz and 100MHz, and ‘Table 45 PLL Frequency Examples’ in the datasheet shows that with a 24MHz MCLK we want the output frequency of the PLL to be 98.304MHz, which is then divided down to 12.288MHz. So from looking at the datasheet, something like …
diff --git a/sound/soc/tegra-alt/tegra_t186ref_mobile_rt565x.c b/sound/soc/tegra-alt/tegra_t186ref_mobile_rt565x.c
index 36572f1a9288..ad8b3f9e3000 100644
--- a/sound/soc/tegra-alt/tegra_t186ref_mobile_rt565x.c
+++ b/sound/soc/tegra-alt/tegra_t186ref_mobile_rt565x.c
@@ -44,6 +44,7 @@
 #include <sound/pcm_params.h>
 #include <sound/soc.h>
 #include "../codecs/rt5659.h"
+#include "../codecs/wm8960.h"
 #include "tegra_asoc_utils_alt.h"
 #include "tegra_asoc_machine_alt.h"
@@ -314,6 +315,69 @@ static int tegra_t186ref_dai_init(struct snd_soc_pcm_runtime *rtd,
 	}
 	}
+	idx = tegra_machine_get_codec_dai_link_idx_t18x("wm8960-playback");
+	/* check if idx is valid */
+	if (idx != -EINVAL) {
+		unsigned int pll_f2;
+		int sysclk_div;
+
+		dai_params =
+			(struct snd_soc_pcm_stream *)card->rtd[idx].dai_link->params;
+
+		dai_params->rate_min = clk_rate;
+		dai_params->formats = formats;
+
+		/*
+		 * Per the wm8960 data-sheet the PLL performs best in the
+		 * range of 90-100MHz, therefore set the PLL to operate as
+		 * close to 100MHz as we can (the closest multiple of
+		 * clk_out_rate below 100MHz).
+		 */
+		pll_f2 = (100000000 / clk_out_rate) * clk_out_rate;
+
+		if (pll_f2 < 90000000 || pll_f2 > 100000000)
+			dev_warn(dev, "PLL frequency is not optimal: %uHz\n",
+				 pll_f2);
+
+		err = snd_soc_dai_set_pll(card->rtd[idx].codec_dai,
+					  WM8960_SYSCLK_PLL, WM8960_SYSCLK_MCLK,
+					  24000000, pll_f2);
+		if (err) {
+			dev_err(dev, "failed to configure codec PLL: %d\n", err);
+			return err;
+		}
+
+		/*
+		 * The wm8960 has a fixed divide-by-4 divider on the output
+		 * of the PLL, so take this into account when calculating
+		 * the sysclk divider.
+		 */
+		sysclk_div = pll_f2 / (clk_out_rate * 4);
+
+		if (sysclk_div == 0) {
+			dev_err(dev, "codec sysclk frequency is too slow!\n");
+			return -EINVAL;
+		}
+
+		sysclk_div = sysclk_div == 1 ? 0 : 2;
+
+		err = snd_soc_dai_set_clkdiv(card->rtd[idx].codec_dai,
+					     WM8960_SYSCLKDIV, sysclk_div);
+		if (err) {
+			dev_err(dev,
+				"failed to configure codec clock divider: %d\n",
+				err);
+			return err;
+		}
+
+		err = snd_soc_dai_set_sysclk(card->rtd[idx].codec_dai,
+					     WM8960_SYSCLK_PLL,
+					     pll_f2 / (sysclk_div * 4),
+					     SND_SOC_CLOCK_IN);
+		if (err < 0) {
+			dev_err(card->dev, "codec_dai clock not set\n");
+			return err;
+		}
+	}
+
Please note that I am not familiar with this codec, so it is a bit of guesswork. However, I don’t see any reason why this cannot work.
With regard to the Tegra I2S error, maybe we should ensure that the word clock (lrclk) and bit clock frequencies being generated by the codec match what we would expect first. It is important to get the clock configuration right first.
The sysclk seems to be OK according to Table 45 in the datasheet. I probed the bit clock and got a frequency of 3.072MHz, which is twice the bit clock from the debug output. That’s wrong, I think, isn’t it?
Also, I cannot see the lrclk. The signal is continuously high.
This was my fault; I probed the wrong pin. I can see the lrclk now. Its frequency is 96kHz, but it should be 48kHz, I guess. So the frequencies of the probed clocks are twice the expected ones. Because the clocks are generated from the sysclk, I think the sysclk is already wrong.
Yes I may not have had the sysclk dividers quite right, it was a bit of guess work. There is a good diagram in the codec data sheet showing the clock dividers, etc and so you may wish to use this to figure out the correct configuration.
I finally got the clocks working, lrclk is now 48kHz and bclk 1.536MHz. This is the code I used:
idx = tegra_machine_get_codec_dai_link_idx_t18x("wm8960-playback");
/* check if idx is valid */
if (idx != -EINVAL) {
	unsigned int pll_f2;
	//int sysclk_div;

	dai_params =
		(struct snd_soc_pcm_stream *)card->rtd[idx].dai_link->params;

	dai_params->rate_min = clk_rate;
	dai_params->formats = formats;

	/*
	 * Per the wm8960 data-sheet the PLL performs best in the range of
	 * 90-100MHz, therefore set the PLL to operate as close to 100MHz
	 * as we can.
	 */
	pll_f2 = (100000000 / clk_out_rate) * clk_out_rate;

	if (pll_f2 < 90000000 || pll_f2 > 100000000)
		dev_warn(card->dev, "PLL frequency is not optimal: %uHz\n",
			 pll_f2);

	err = snd_soc_dai_set_sysclk(card->rtd[idx].codec_dai,
				     WM8960_SYSCLK_PLL, pll_f2 / 4,
				     SND_SOC_CLOCK_IN);
	if (err < 0) {
		dev_err(card->dev, "codec_dai clock not set\n");
		return err;
	}

	err = snd_soc_dai_set_pll(card->rtd[idx].codec_dai,
				  WM8960_SYSCLK_PLL, WM8960_SYSCLK_MCLK,
				  24000000, pll_f2 / 4);
	if (err) {
		dev_err(card->dev, "failed to configure codec PLL: %d\n", err);
		return err;
	}
}
I could remove the part where you set the sysclk_div, because it is already configured inside the wm8960_configure_clocking() function. I also had to set the sysclk to pll_f2/4; otherwise the configuration of the clock dividers is wrong. I don’t know why it is implemented that way, and maybe I am still using it wrong. Nevertheless, the clocks look OK now. This is the output:
Now, the problem is that I cannot see any audio data transferred between codec and Jetson. It doesn’t matter if I try to play or record something. Any idea?
Also I still have the issue with the order of the codec initialization. The function wm8960_hw_params() has to be called before the wm8960_configure_clocking(). Otherwise, bclk and lrclk are not initialized and the playback fails at the first try. How is it possible to change this?
With regard to the ordering issue, I wonder if we need to configure the sysclk, PLL, etc. earlier. Can you try moving the code into an init function for the wm8960 …
@@ -523,6 +586,11 @@ static int tegra_t186ref_compr_set_params(struct snd_compr_stream *cstream)
 }
 #endif
 
+static int tegra_t186ref_wm8960_init(struct snd_soc_pcm_runtime *rtd)
+{
+	...
+}
+
 static int tegra_t186ref_pcm1864_init(struct snd_soc_pcm_runtime *rtd)
 {
 	int err;
@@ -973,6 +1041,9 @@ static void dai_link_setup(struct platform_device *pdev)
 		else if (strstr(tegra_t186ref_codec_links[i].name,
 			 "pcm1864-link1"))
 			tegra_t186ref_codec_links[i].init = tegra_t186ref_pcm1864_init;
+		else if (strstr(tegra_t186ref_codec_links[i].name,
+			 "wm8960-playback"))
+			tegra_t186ref_codec_links[i].init = tegra_t186ref_wm8960_init;
 	}
 }
The result is the same in both cases: sysclk and lrclk are OK, but nothing happens on the data pins.
This does not work, because clk_out_rate is not available at this time. I think wm8960_hw_params() has to be called before wm8960_configure_clocking(). Looking at the backtraces, it seems that they are not called from the machine driver (tegra_t186ref_mobile_rt565x.c):
Does this mean that we have to change it somewhere else, or can we just include an additional function call inside the machine driver? I’m not really familiar with the Linux audio subsystem, so I would appreciate it if you could help me with this issue.
And by ‘equal’ you mean you still see the error message regarding the sw reset? If so, then the I2S interface is still not seeing the bit clock. It definitely works, because this has been tested (doing an external loopback between I2S interfaces on the TX2). So the only thing I can think of is that the bit clock is not turning on quickly enough. Can you insert a large mdelay (i.e. 1 second, which will be more than enough to prove whether this is the issue) after the bit clock on the codec is enabled (probably in the codec hw_params function)?
Ah yes, good point. So what I would do is this …
1. Move the code that sets up the codec clock to a new function that takes the clk_out_rate as a parameter (in addition to any other parameters you need to pass).
2. Call this new function from both the codec init function (here with a default clk_out_rate, for example 48000 * 256 = 12288000) and from the machine driver hw_params (as you had before).
I don’t know which function activates the bit clock output. I checked different places inside the codec driver but could not find the right one. Do you think it should be wm8960_hw_params()? I added a wait at the end of this function, but the clock was not active at that point. I also checked the other functions of the codec driver but could not find the one that activates the clocks. Here are the function calls that I logged:
I can see the clocks only after the error message has already occurred. It seems that wm8960_mute is the last function of the codec driver to run, but this function does not activate the bit clock either. Maybe something else activates the codec clocks? It’s weird.
I did this, but it gives me the same result. The thing is that wm8960_hw_params() is called too late. Because of this, wm8960->bclk and wm8960->lrclk are not set. Do you know which function calls the codec’s hw_params()? Maybe I could try to change the order there?
Yes sounds like the problem is that the clocks are turned on too late. Probably the best thing to do is check the codec datasheet and see what bit in what register enables the bit-clock and then figure out in the driver where this is done.
Hmmm … indeed. I am not sure if the problem is that the codec is in some odd bias state on boot and assumes that, when the set_bias function is called, the bclk and lrclk have already been configured. It seems like a bug in the codec driver. I checked the driver in the latest Linux kernel, but it seems to behave the same way. You may wish to ask the codec vendor about this.
Ok, I think I found the issue. The bitclock is activated after enabling the ADC or DAC. Both are activated through DAPM widgets. I enabled some debug output using this guide: Debugging DAPM for ASoC codec | Experiences
The result is the following:
As one can see, the I2S widgets are enabled before the codec widgets. Therefore the clocks are not available when the Jetson expects them. Is this a bug?
To confirm my assumption, I activated the DAC manually inside the wm8960_hw_params():
With this workaround, I can see the clocks on the oscilloscope earlier and the error message disappears.
I can also probe the audio data now. Unfortunately, I still cannot hear anything when playing an audio file. Recording does not work either; mic boost was activated and I could see some data on the scope.
I would appreciate it if you could give me some hints for debugging the issue further.
OK, great. Do you see any userspace controls for these DACs?
amixer -c tegrasndt186ref controls | grep DAC
If so you could try enabling from userspace instead of the driver.
At first glance the routing seems fine, as it appears to be routed to the headphones, which you had before. I am not sure why the routing would be different from what you had before, when the codec was the slave. Check for the ‘Headphone Playback Volume’, ‘LOUT1 PGA’ and ‘ROUT1 PGA’ controls, and make sure the PGAs are enabled.
So I think I still have to enable it from the driver. As a workaround, that would be OK for me. Nevertheless, I do not understand why the Jetson stops waiting for the clock before the codec DAPM widgets are enabled. Shouldn’t the I2S input of the Jetson wait for some time when it is configured in slave mode?
The headphone playback volume is OK. I also checked the other output controls using alsamixer. Everything seems to be correct.
When I start playing an audio file after a reboot, I can hear it. Unfortunately, it is played at the wrong speed because of the wrong clock setting on the first run. When I try to play it again, the clocks are fine but I only hear some “clicking” noise. Do you have an idea?
It is during the enabling of the Tegra I2S interface (via the I2S DAPM widget) that the I2S interface is reset, and we expect the I2S clock to be enabled at this point. So I am not sure that we can wait. Please note that I have tested the SGTL5000 (using the Fe-Pi Audio Z V2 board), which works in the same way as being the I2S master, and I have not had problems with it.
I can’t say that I do. However, I think the wrong-clock-setting issue on the first run needs to be fixed, as this should not happen, and maybe the codec is not being configured correctly. If you have a Raspberry Pi, you could try testing the board with it and see how it sets up the clocks. It is not clear to me whether the codec is the master or slave with the Raspberry Pi. I would have assumed it is the master, but I am not certain.