Finally, you hope for the best: even if all else is perfect, the game still has to pass 2 random checks to reach 100%. So even though you did everything right, you might end up with a game that scores anywhere between 96% and 100%.
Hi. If I understand correctly, on legendary difficulty every year adds +70 points. So what is the necessary maximum when I start the game in 1980? And the current points for 1991, for example, = 770? Or did you mean 1980 when you wrote 700 for legendary, not 1990?
And a last question. If my team reaches, for example (legendary, year 1990), 1200, 1100, 900 and 700 points, everything above 700 is wasted, but I still have 100% of the necessary maximum of points, right?
The current necessary maximum in 1980 is 35.
The difficulty scales linearly after year 4. During the first 4 years your numbers get multiplied by a decreasing factor that starts at 2 in 1980 and ends at 1.1 (or 1.2; I can't exactly remember right now) in year 4. So your assumption is correct: since every year is worth 70 points, 1980 should be 70 points; divide by 2 for the factor and you actually just need 35 points.
correct.
Correct. Many players excessively produce points, especially for the gameplay counter.
Edit: These numbers are actually only correct for the last month of the preceding year. The maximum rises every month. So a game published in November 1980 actually requires:
(70 * (1 + 11/12)) / 2 = 35 * 1.9166... = 67.0833...
December 1980 would be 70.
Generally speaking, it usually suffices to take the end-of-year value to check whether a game is strong enough. As a rule of thumb, year * 70 totally works. If you want to be on the safe side: (year + 1) * 70.
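If it helps, here is that calculation as a small Python sketch. The per-difficulty factors come from the formula breakdown further down this thread; the in-between values of the early-years divisor (1.7, 1.4) are my own linear guesses between the 2.0 start and the roughly-1.1 end mentioned above, so treat them as assumptions:

```python
# Sketch of the required-points formula pieced together from this thread.
# ASSUMPTION: the early-years divisor falls linearly from 2.0 in 1980 to
# about 1.1 in year 4; the in-between values (1.7, 1.4) are my guesses.

DIFFICULTY_FACTOR = {"easy": 0.35, "normal": 0.55, "hard": 0.65, "legendary": 0.7}
EARLY_DIVISOR = {1: 2.0, 2: 1.7, 3: 1.4, 4: 1.1}  # year 1 = 1980

def required_points(year, month, difficulty="legendary"):
    """Points needed for 100% when releasing in `month` of `year`."""
    time_factor = (year - 1979) + month / 12        # e.g. Nov 1980 -> 1 + 11/12
    base = 100 * DIFFICULTY_FACTOR[difficulty] * time_factor
    return base / EARLY_DIVISOR.get(year - 1979, 1.0)

print(required_points(1980, 11))  # ~67.08, the November 1980 example above
print(required_points(1980, 12))  # 70.0
```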
Greetings from Poland.
Most players that I have seen are hamstrung, though, by overgenerating points for one or two categories (usually Gameplay and Graphics) while neglecting another (especially Technology/Control).
The strategy is that once the specialist departments and some of their improvements are unlocked (you need to research them as soon as possible and use them in your games, otherwise your game will get downgraded), you can curb the production for that category in your main dev room.
An example: you have a team of four balanced devs in your dev room (all four dev skills are at 50). You'd normally want a priority slider setting of 25% each to evenly allow for point generation. Once QA improvements are unlocked, you can decrease the gameplay priority in your game design to 10% and increase the other three to 30% each.
Your main dev group will generate fewer gameplay points, but your specialist department will make up for that, because all they do is generate gameplay points anyway.
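As a tiny sanity check in Python (the category labels are just names I picked):

```python
# Priority sliders from the example above; the labels are mine.
balanced = {"gameplay": 25, "gfx": 25, "sfx": 25, "control": 25}  # before QA
with_qa  = {"gameplay": 10, "gfx": 30, "sfx": 30, "control": 30}  # QA unlocked

# Only the split changes; the sliders still have to add up to 100%.
assert sum(balanced.values()) == sum(with_qa.values()) == 100
```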
I hope everybody is OK with that.
The formula used to determine the review score is:
category score [%] = generated points / ((year - 1979 + month / 12) * difficulty factor)
Month is just your usual month count. January is 1. December is 12.
Month / 12 yields twelfths, starting at 1/12 in January and ending at 12/12 = 1 in December.
This is added to the year count. So in June 1987 you have [1987 - 1979] 8 + [June] 6/12 = 8.5.
This value is multiplied by the difficulty factor: 0.35 for Easy; 0.55 for Normal; 0.65 for Hard; 0.7 for Legendary.
You can see that your linearly increasing time factor is scaled down more, the lower your difficulty is. December 1989 is 11 [10 + 12/12]; multiplied by the difficulty factors, you get values between 3.85 and 7.7.
The points you have generated are divided by that combined time and difficulty quotient. The smaller the quotient, the higher the result.
Take a game that has 300 gameplay points in December '89. On Legendary that gives 300 / (11 * 0.7) ≈ 39%; on Easy, 300 / (11 * 0.35) ≈ 78%.
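Here is the whole subscore calculation as a short Python sketch (the names are mine; the math is exactly the formula described above):

```python
DIFFICULTY_FACTOR = {"easy": 0.35, "normal": 0.55, "hard": 0.65, "legendary": 0.7}

def subscore(points, year, month, difficulty):
    """Category percentage: generated points over the time/difficulty quotient."""
    time_factor = (year - 1979) + month / 12   # June 1987 -> 8 + 6/12 = 8.5
    return points / (time_factor * DIFFICULTY_FACTOR[difficulty])

print(subscore(300, 1989, 12, "legendary"))  # ~38.96
print(subscore(300, 1989, 12, "easy"))       # ~77.92
```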
If you are looking for how many points you actually need to get 100% (i.e. to determine wastage and the necessary maximum of points):
100 * difficulty factor yields 35, 55, 65 and 70 for the different difficulties, i.e. you can use these values multiplied by the year count to eyeball the necessary points.
Alas, you don't reach 100% during the review process: Gameplay and Total are cut off at different points, so using the 100% baseline means that you are slightly overgenerating points.
If you want to be exactly precise: before the cut-offs (at 80 for gameplay, at 88 for total, and at 100% for the other three [SFX, GFX, Control]), your intermediary results are weighted according to genre and added to form the total score.
Let's take an adventure for the next example.
Adventure is weighted for GFX, SFX, Control, Gameplay at 0.2, 0.1, 0.3 and 0.4.
Let's assume the game has enough points to satisfy the necessary maximum.
Total = 0.2 * 100 [GFX] + 0.1 * 100 [SFX] + 0.3 * 100 [Control] + 0.4 * 100 [Gameplay] = 1 * 100 = 100.
That was easy. Gameplay gets cut off at 80, Total at 88.
Let's assume the game wasn't perfect. It had subscores of 90%, 110%, 70% and 160%.
Total = 0.2 * 90 [GFX] + 0.1 * 110 [SFX] + 0.3 * 70 [Control] + 0.4 * 100 [Gameplay, 160 capped at 100] = 18 + 11 + 21 + 40 = 90%.
Gameplay still gets cut off at 80. Total still manages to get past the cut-off point of 88% despite a severe lack in Control.
This lack will come back later to haunt the review score in the form of deductions.
But that would lead too far.
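For the curious, here is that totalling step as a minimal Python sketch. One assumption on my part: every subscore is capped at 100 before weighting, which is how 160% gameplay contributes 0.4 * 100 = 40 above. (Under that same uniform cap the 110% SFX would contribute 10 rather than 11, so the exact cap behaviour is a bit uncertain.)

```python
ADVENTURE = {"gfx": 0.2, "sfx": 0.1, "control": 0.3, "gameplay": 0.4}

def review_total(subscores, weights):
    # Weighted sum of subscores, each capped at 100 (assumption, see above).
    total = sum(w * min(subscores[cat], 100) for cat, w in weights.items())
    displayed_gameplay = min(subscores["gameplay"], 80)  # gameplay cut-off
    displayed_total = min(total, 88)                     # total cut-off
    return total, displayed_gameplay, displayed_total

perfect = {"gfx": 100, "sfx": 100, "control": 100, "gameplay": 100}
flawed  = {"gfx": 90, "sfx": 110, "control": 70, "gameplay": 160}
print(review_total(perfect, ADVENTURE))  # ~ (100.0, 80, 88)
print(review_total(flawed, ADVENTURE))   # ~ (89.0, 80, 88) under the uniform cap
```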
So to summarise:
Year [>1] * difficulty factor [0.35; 0.55; 0.65; 0.7] * 100 is the formula used to determine how many points are necessary at any given time.
I'd have uploaded a graph to Steam to illustrate that point, if Steam had let me upload my 'artwork'. But alas, I guess my graph wasn't arty enough...
This raises the question: if/when/how is the gameplay cut-off raised above 80?
In your adventure example, say Easy 1989 for instance, GFX would still need 350 points, not 350 * 0.2 = 70; the 0.2 is just the dev weighting, correct?
Also, the year you start does not change the (year - 1979) factor, correct?
Maybe consider uploading your fartsy-challenged graph to Google Docs.
(guide edited to pending)
If one of the game design settings (notched sliders) is off, you will incur a penalty to gameplay; if one of the available improvements (QA, graphics studio, sound studio, motion capture) isn't used for the game, you will incur a penalty; if bugs remain, you will incur a penalty; and if any of the other three categories is rated below 90%, you will incur penalties to gameplay.
So you have only one shot at a >90% gameplay game but lots of opportunities to lose points.
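Summed up as a Python checklist (the thread doesn't give the sizes of the deductions, so this only flags penalty sources rather than quantifying them; all names are mine):

```python
# Checklist of the penalty sources listed above; magnitudes are unknown.
def gameplay_penalties(sliders_ok, improvements_used, bugs_left, other_scores):
    reasons = []
    if not sliders_ok:
        reasons.append("a design slider is off its notch")
    for dept in ("QA", "graphics studio", "sound studio", "motion capture"):
        if dept not in improvements_used:
            reasons.append(dept + " not used on this game")
    if bugs_left > 0:
        reasons.append("%d bugs left in the release build" % bugs_left)
    reasons += [cat + " rated below 90%" for cat, s in other_scores.items() if s < 90]
    return reasons

print(gameplay_penalties(True, {"QA"}, 2, {"gfx": 95, "sfx": 88, "control": 70}))
```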
The weights of the subcategories are given in the game report. Most people mistake them for the "perfect settings" for the priority sliders, since both operate with percentage points.
So your assumption is correct. You still need enough points to satisfy year * difficulty factor * 100. So for 1989 on Easy that would be 350 < x < 385 (35 * 10 at the start of 1989, rising to 35 * 11 by December).
I always started in 1980, so I can't definitively answer your last question; I will look to get an answer to that. Functionally there shouldn't be a difference between a 1980 and a 2000 start. You'd still have a bad developer to start with, and you still wouldn't be capable of producing anything but a 20-odd-point game.
The game report will also tell you which subgenres are good combinations and whether the topic and subtopic match up well.
Get all the values with green ticks on an engine with all the available features and you should be in the right ballpark, if not hitting home runs.
The hard part is getting in enough sequels to home in on the optimum values before the in-fashion topic and genre change again.