How is Google Trends data normalized?
I have a daily series from Google Trends, pulled with the range "today 3-m", but the most recent day is missing from this query. For example, today is March 24th and the last day returned is March 22nd, while I expected it to be March 23rd. If I instead request the range "now 7-d", the data comes hourly and does include March 23rd. I would like to aggregate the hourly series to daily values and put it on the same scale as the first series. For that, I need to know how the series is normalized.

My understanding was that each time you request a series (one region and one search term), Google simply divides each point by the largest value in the range and multiplies by 100, so the maximum of the series is 100. Under this hypothesis, if I sum the hourly values of the second series by day, every day shares the same denominator (different from the first series' denominator, but constant within the series). In that case, the growth between adjacent days would have to be identical in the two series. But that is not what I observe, so I must be misunderstanding the normalization. Could anyone help me, please?
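To make the reasoning concrete, here is a minimal sketch of the hypothesis described in the question (the function and the sample numbers are purely illustrative, not Google's actual method): if each series is just its raw values divided by the series maximum and scaled to 100, then normalization is a constant rescaling, so the ratio between any two adjacent days survives unchanged in both series.

```python
# Illustrative sketch of the max-scaling hypothesis from the question.
# The raw numbers below are made up; they only demonstrate the argument.

def normalize(series):
    """Scale a series so its maximum value becomes 100."""
    peak = max(series)
    return [100 * v / peak for v in series]

daily_raw = [40, 50, 80, 60]          # hypothetical "today 3-m" daily values
hourly_summed_raw = [50, 80, 60, 70]  # hypothetical "now 7-d" values summed by day

daily = normalize(daily_raw)          # -> [50.0, 62.5, 100.0, 75.0]
by_day = normalize(hourly_summed_raw) # -> [62.5, 100.0, 75.0, 87.5]

# Under pure max-scaling, day-over-day growth on overlapping days must match,
# because dividing every point by a constant cancels out in the ratio:
print(daily[2] / daily[1])   # 1.6  (same underlying days: 80 / 50)
print(by_day[1] / by_day[0]) # 1.6
```

Since the observed Google Trends series do not show matching growth rates, this simple model cannot be the whole story (in practice, factors such as sampling and rounding to integers also affect the published values).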
Topic normalization google
Category Data Science