Find VC dimension

I'm studying theoretical machine learning at university, and I have a problem in my textbook that I have no idea how to start. In the space $X = \mathbb{R}^2$ we are given two models: $H_1$ (rectangles with sides parallel to the coordinate axes) and $H_2$ (lines). We define a model $H_3$ such that every hypothesis is a combination of one hypothesis from $H_1$ and one from $H_2$. Prove or disprove that the model $H_3$ has a larger VC dimension (Vapnik–Chervonenkis dimension) than the VC dimensions of the models $H_1$ and $H_2$.

I need to find the VC dimension of $H_1$, of $H_2$, and of the combined model $H_3$.
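One way to build intuition before the proof is to check shattering on small point sets by brute force. Below is a minimal sketch (not from the textbook) for the axis-aligned rectangle class $H_1$ only; it relies on the observation that a labeling is realizable by an axis-aligned rectangle iff the bounding box of the positive points contains no negative point. The point set and function name are illustrative choices, not anything specified in the problem.

```python
from itertools import product

def rect_shatters(points):
    """Return True if axis-aligned rectangles shatter the given points.

    A labeling is realizable iff the bounding box of the positively
    labeled points contains no negatively labeled point.
    """
    for labels in product([0, 1], repeat=len(points)):
        pos = [p for p, lab in zip(points, labels) if lab == 1]
        neg = [p for p, lab in zip(points, labels) if lab == 0]
        if not pos:
            continue  # all-negative labeling: a degenerate rectangle works
        xmin = min(x for x, _ in pos); xmax = max(x for x, _ in pos)
        ymin = min(y for _, y in pos); ymax = max(y for _, y in pos)
        if any(xmin <= x <= xmax and ymin <= y <= ymax for x, y in neg):
            return False  # some labeling cannot be realized
    return True

# Four points arranged in a "diamond" are shattered, so VCdim(H_1) >= 4.
print(rect_shatters([(0, 1), (1, 0), (0, -1), (-1, 0)]))  # True
```

A matching upper-bound argument (no five points can be shattered, since one of them always lies inside the bounding box of the other four's extremes) would then pin down the exact value; a similar realizability check could be written for $H_2$ and for whatever combination rule defines $H_3$.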

Topic vc-theory machine-learning-model machine-learning

Category Data Science
