10000 shots spent

Wow eydryan :hail: ... I wish I had the time to shoot that much! :hail: Dude, what do you shoot?
 
This is why I prefer 5x4 and 10x8 - you do it in one shot (that's a 100% success rate).
But my shooting rate depends on what I'm shooting. When it's people you have to pop off a few rolls, but with still life or landscape I only shoot 1 (3 at the most) - by then I know what I'm doing and what I'm trying to do.
If you find yourself shooting silly amounts then you might as well go the extra inch and buy a cine camera. ;)
 
heheh well i just integrate it into whatever i'm doing. these 300 pics/day are with my class and such, mostly non-artistic stuff, so it doesn't really take up much time...

i've had this window open for an hour now so i'll just hit quick reply and close it, apparently i have no time for it :(
 
eydryan said:
mercury my man you shoot quite little.

Especially considering you have a brand new camera! I took photos like crazy after getting mine!

eydryan said:
and about digital, well yes it's evolving. it'll go where no man can even imagine, trust me on that one. resolution will be so high that zoom lenses will be superfluous, colours will mimic human retina perception, and all that will fit in the corner of your eye or maybe a contact lens, huh? :D film will die soon but not yet; it's still quite strong with so many pros still shooting film. it's still a trend. but it'll pass. in 10 years everybody will have pocket compacts like the t7 capable of doing so much more, and a photolab in every home.


Maybe in 100 or even 200 years. But not 10. Take a look at everything that was predicted for the year 2000. Most of it isn't even remotely close to reality, or even 'on the way'.

eydryan said:
and nobody will shoot film anymore except a few old people remembering the good days. it's sad but it's true, i think. digital will win. like the otto engine over the steam engine...

Film will always be trendy. There will always be enthusiasts to keep it alive, not just old people, but college students and such.

p.s. how does everything somehow turn into film v. digital?? :confused:
 
100 years, jadin? look at the rise of resolution and new systems in the past ten or twenty years. we've gone from the old cameras to digital systems capable of reproducing almost film-like resolution. and all digital systems evolve rapidly, and especially together, so it's actually a continued technology boom nowadays. 30 years tops and no one will be using film (statistically i mean - sure, there will be enthusiasts).

film will die, that is my opinion and i'm keeping it. maybe not in 10, maybe not in 20, hell maybe not even in a hundred, but it will vanish. just think of everything else that evolved: the new stuff always takes over. and as for remaining for everyone, i doubt it. maybe for some students, some nostalgics, but that's about it. i mean, think about it nowadays: would you really send a letter in the post and wait 3 to 5 weeks for it to arrive, or would you use email? :D that's what digital offers. a cheaper, simpler way of doing what you want, with none of the minuses.

now i dunno why either, but maybe we like it like sci-fi - just what will it be? :D no matter what, even if film dies (and i love film, just not its cost...), the future looks mighty sweet for everybody...

EDIT: and about prediction: in digital systems the growth was actually mathematically predicted by this guy in the 80s, and it has followed his formula exactly for the past 25 years. he says it will bottom out (due to spatial limitations) around the year 2018. now this is pure math, and the limitation is that you reach atom size, and then you'd have to come up with another technology. like the optical chip - you heard about that? they made one of those optical computers, a laptop, and the chip is smaller than a coin. so the technology is already available, but it's not cheap enough yet.
 
Yeah 100 years.

You must be referring to the doubling of speed every 18 months. While that may be true, it's all based on the same technologies. In order to achieve the kind of jumps you're talking about you'd need to start from scratch with brand new technologies. It can't be based on current designs.

I realize what you're saying about the advances in technology, and I agree they are impressive. But it seems to take forever for them to be implemented in consumer products. The optical computer, for instance: how long do you think it will be before it's purchasable by you and me? That's what drags technology down.

The potential is there, but it takes forever to be fully implemented.

Case in point: the first digital sensor (CCD technology) was made in 1970. That was 35 years ago and we're still using the same technology in digital cameras. 35 years! In 1973 the CCD size was 100x100 pixels (10 kilopixels). In 2005 the largest commercially available digital camera (excluding medium / large format) is 16.6 megapixels - a size that should have been reached in 1989 based on the 18-month doubling prediction. By now we should have 20-gigapixel cameras. How long do you think it will be before we see one?
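For what it's worth, the arithmetic behind those dates checks out. Here's a rough sketch of the 18-month doubling claim, using only the 1973 and 2005 figures quoted above:

```python
import math

# 18-month doubling, starting from the 1973 CCD (100x100 = 10 kilopixels)
START_YEAR = 1973
START_PIXELS = 100 * 100

def predicted_pixels(year):
    # one doubling every 1.5 years
    return START_PIXELS * 2 ** ((year - START_YEAR) / 1.5)

def year_reached(pixels):
    # invert the curve: when "should" a given pixel count have arrived?
    return START_YEAR + 1.5 * math.log2(pixels / START_PIXELS)

print(round(year_reached(16.6e6)))   # 16.6 MP "should" have arrived around 1989
print(predicted_pixels(2005) / 1e9)  # the 2005 prediction lands in the tens of gigapixels
```

So the 1989 date and the "20 gigapixel" ballpark both fall straight out of the doubling assumption.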

We won't. Not until the current technology is abandoned and new technologies are developed. That's my opinion anyway.
 
first of all, no, i'm not referring to the doubling every n months, that is folklore. what i am referring to is an actual mathematical theory which foresaw this development. it was once on a show on discovery so i can't quite remember who said it, but trust me it's there... and the growth was exponential.

and what jumps do you mean? if i can think of it then we already have the technology. maybe not to make them fly or fit in your eye, but otherwise it's all there... jumps would only be necessary for film. that's the cool thing with digital: you have one superchip which can do anything :D link more of them together and voila, you have a revolutionary thing.

of course as an sf lover i am optimistic, but you must agree that it is possible. and the optical computer - well, the laptop i was telling you about has a 6GHz processor, 1TB RAM and a 2TB HDD, and it cost just $1800! now if that is a lot, i don't know what isn't. so you see, it is there. and a canon bla bla mark II is $7000. so the technology is here and is readily available (well, not for me, but if i would even make $5 with it i'd buy it). implementation is a bit of a ***** but otherwise all is possible.

and do not forget that the ccd was not made for image capture. and also that now there are CMOS sensors. and i don't get why they don't just make bigger sensors.

well, the point is to see all opinions and not just argue like madmen :lol: but it's interesting to argue and discuss in this thread too :D
 
I've never heard of the equation you speak of from the discovery channel. I'd be interested to know more...

You didn't say anything about jumps in technology. That was my opinion that they would be needed to achieve the results you spoke of. Current techniques will only go so far before they have to be retired.

You're right about having a superchip and its limitless possibilities. But there are problems with current technologies. (This next paragraph is pretty much from memory from several years ago. Things could've changed, and I could be remembering things incorrectly. So take it with a grain of salt.)

CPUs would be significantly faster if heat weren't a factor. You send an electrical charge through the silicon (or whatever is in your CPU) to make it work. Adding more electricity speeds up the processor, but it also heats up faster. The end result is you can only send so much electricity through before the chip overheats and becomes unstable. The reason CPUs are continually made smaller is so that they can run faster on the same amount of electricity without raising the amount of heat generated. Eventually, however, they become so small that you run out of room and have to start over on the design. Thus the differences between CPUs. (Pentium 3 vs Pentium 4, as a quick example)
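The trade-off described above is usually summarised by the dynamic-power rule of thumb P ≈ C·V²·f (switched capacitance × voltage squared × clock frequency). A toy calculation - the numbers here are made up purely for illustration, not real chip figures - shows why a process shrink buys clock headroom at the same heat budget:

```python
# Dynamic CPU power rule of thumb: P ~ C * V^2 * f
# (C = switched capacitance, V = supply voltage, f = clock frequency)
def dynamic_power(c, v, f):
    return c * v ** 2 * f

# Illustrative numbers only - not real chip figures.
p_old = dynamic_power(c=1.0, v=1.5, f=2.0e9)  # older, larger process
p_new = dynamic_power(c=0.5, v=1.2, f=2.0e9)  # shrunk process, same clock

# At equal power draw (and so roughly equal heat), the smaller
# process could run its clock this many times higher:
headroom = p_old / p_new
print(headroom)  # 3.125
```

Since power grows linearly with clock but quadratically with voltage, shrinking the transistors (lower C, lower V) is what lets the clock climb without the chip cooking itself - until, as the post says, you run out of room.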

The big jumps in speed will come from abandoning old technologies completely to avoid these limitations. The optical computer you speak of, for example: by abandoning the silicon / copper design of the CPU, they are able to achieve much greater speeds without the old limitations. $1800?? Did you forget a zero or two?? 1800 bucks is a STEAL! What's great about optical is it will be able to achieve much better results (1TB RAM etc.) than the old technologies it overtakes.

The mark II is still based on the old principles, however. CMOS is the next step above CCD but the results are pretty much the same (5-10 MP vs 16 MP). I don't see CMOS taking us even close to the results you are talking about. Eventually someone will discover a whole new way to capture the images to digital. Only then will the next great jump occur.
 
I was interested in the optical computer so I did a quick search. Wikipedia had the following -

wikipedia said:
No true optical computers are declassified or otherwise known to exist.

This is exactly my point. Not only are they not ready for mass production - they aren't even known to exist!
 
firstly, http://atomchip.com/_wsn/page4.html - i can't find the commercial page. so wiki needs to rewrite some stuff :D the laptop i speak of is here, i'm not bul****ting. wiki has flaws. google doesn't. :D

the equation i'll search for, although i have no idea how... and nothing comes up right now. but i'll keep searching. i remember it was one of the really big mathematicians, but i can't put my finger on it...

you speak of current technologies being retired. why? and which ones have been retired?

well, the heating is not an issue. you already have computers running on liquid nitrogen as coolant, so that isn't the problem, just maybe a minor setback. and yes, platforms need shifting and changing, but technology as a whole moves in the same direction. i mean the pentium 4 is an evolved platform of the prescott chip if i remember correctly, and so on. they all evolve. old chips are dumped for new chips. but that's just evolution. bigger, better and smaller :D

and about performance: the optical can reach 6.5GHz. the pentium 4's platform can technically go up to 10GHz before becoming obsolete... however, opinions about this optical computer are split. some say it is a hoax, but there is a lot of info to back their story up. i mean, i've seen the atom flow charts. they're talking about using a magnetic trap to trap electrons and use them as ones and zeros. i can't find the page with the principles, but i've read it and it's for real. also, there has been another optical computer i've seen in the german chip magazine or something when i went there. that one is real real :D but it's a bit big, like 2-3 times your desktop computer. so it's bulky.
this is the optical RAM: http://atomchip.com/db4/00366/atomchip.com/_uimages/512GBram.JPG

AHA: i've found the site for the foundation. check out the research page. you'll be amazed... http://pi1.physi.uni-heidelberg.de/physi/atph/index.php

don't write cmos off so fast, it's only beginning. give it a year or two to grow and then we'll see. but you are right about other means of digital recording. unfortunately, as pixels make up the digital world, it is very hard to think of anything else...


and in the end something you know but which fits into our little discussion:

Bill Gates once (supposedly) said "640K ought to be enough for anybody."
 
You missed the key word - No true optical computers are declassified or otherwise known to exist.

The laptop seems to use optics to enhance current electron-based systems. Which is great, since the results are stunning, but a true optical computer would leave that thing in the dust. The key points of the laptop seem to be storage size and RAM size, but its speed seems limited by its electron-based components.

I'll give you the fact that CMOS is fairly new so it may yet take us leaps and bounds into the next era of digital photography.

I disagree about the Pentium 4 being able to go to 10GHz before becoming obsolete. I think 10GHz is somewhat of a pipe dream for that CPU, perhaps achievable with liquid cooling (which, I want to add, will take a long time to become an accepted component of computers). Truth is, the CPUs we see today are running at the very maximum the chipmakers can get them to run stably. If they could get them to run faster, we would see them on the market. Instead, their engineers are continually looking for ways to tweak the current designs to gain an extra 100MHz here, an extra 100MHz there. It's impressive what they are able to accomplish, but I would bet my firstborn there will be a new chip design that takes over long before the 10GHz mark is reached.
 
How many megapixels do you guys need? You can work with 4, but 16 is plenty for a full-frame 35mm sensor. And 40??? Give me a break. :lol: Yes, I'd love to have a play with it, but unless I have a specialist need for it, it can stay in the shop.

Digital handles colour more neutrally than film IMO, and you get far more colour resolution than you can see - even when you take heavy processing into account.

So why do I want to use film?

Film is warm and has character - digital is cold and clinical. Of course, this could also be a reason for shooting digital. ;)

Digital is just way too expensive - unless you're budgeting to a 5 year plan. :lol:

So I think as long as film has a warmer character and is 10x cheaper, it will always be the preferred option for some - even if the rest are playing with 192-bit gigapixels. :lol:
 
The more you enlarge, the further away it should be viewed from - which is why we don't look at billboards/posters and say "that's rubbish - I can see all the dots!"

Unless you want to present a large photographic collage type of thing that is meant to be viewed in closer sections; but then I'd call that quite specialist work, and it could be done easily anyway.
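The "view it from further away" rule can actually be put in numbers. Using the common approximation that the eye resolves about one arcminute of detail (a standard rule of thumb, not a figure from this thread), the print resolution you need drops off quickly with viewing distance:

```python
import math

ARCMINUTE = math.radians(1 / 60)  # ~0.00029 rad, the eye's rough resolving limit

def required_ppi(viewing_distance_inches):
    # smallest feature the eye can separate at this distance, in inches;
    # pixels smaller than this are invisible, so 1/feature is enough ppi
    feature = viewing_distance_inches * math.tan(ARCMINUTE)
    return 1 / feature

print(round(required_ppi(12)))   # reading distance: ~286 ppi (hence the usual ~300 ppi prints)
print(round(required_ppi(600)))  # a billboard at 50 ft: ~6 ppi already looks sharp
```

Which is exactly why a billboard printed at a handful of dots per inch still reads as "sharp" from the street.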
 
Marctwo said:
The more you enlarge, the further away it should be viewed from - which is why we don't look at billboards/posters and say "that's rubbish - I can see all the dots!"

Unless you want to present a large photographic collage type of thing that is meant to be viewed in closer sections; but then I'd call that quite specialist work, and it could be done easily anyway.

That's just it - with enough megapixels you don't have to view from a distance. And I wouldn't consider that specialist work either.

What about all the medium and large format shooters out there? Do you think they are wasting all that film? I don't. It allows them much more flexibility as to what sizes they can and cannot print.
 
