I'm looking for the name of a movie about American Indians.
On some TV channel I was watching a movie about the history of American Indians. As I remember, the story was: the American Indians were living a perfect life, then Columbus discovered the American continent. The British began to settle on the American continent in huge numbers. After that they built large villages, established new politics, and the British army attacked and killed the Indians, etc. The story ends with all the Indians realizing that they have to work for money if they want to survive, and that their land has been taken away through politics: it is not their land anymore.
I am not English and I never learned English, so I hope you all understand what I'm saying.
So if you don't know which movie I'm talking about, I would just like the name of any movie about the history of American Indians.