
What is the quickest solution to finding a 4 digit number asking only yes/no questions?

A friend and I were watching a Korean game show called "The Genius", and in it they had a particularly brilliant competitive maths game.
The premise was fairly simple: each contestant picked a four-digit number. They then took turns asking each other yes/no questions, or could use a turn to guess the opponent's number. The only additional rule was that 0 was treated as even for the purposes of questioning.
After watching this, my friend and I tried to come up with a strategy guaranteed to find your opponent's number in the fewest possible questions, but it very quickly got extremely complex. However, we're both fairly sure there's a clever mathematical answer that guarantees it in a low number of questions.
Even after working out the number, you need to use a turn to guess it, so a +1 needs to be added to any question count. (This isn't important if you know the number exactly, but it can be relevant if you can get down to, say, 3 potential options, since just guessing all 3 is as efficient as working out which one it is.)
The obvious first approach we tried was binary searching each digit separately. With this method, each digit could be found in a maximum of 4 questions (10 possibilities to 5 to 3 to 2 to 1), so we knew we needed to beat 16 questions.
We then realised that if we treated the first two digits and the last two digits as two-digit numbers, it would only take a maximum of 7 questions to find each pair (100 to 50 to 25 to 13 to 7 to 4 to 2 to 1), so we were down to 14 questions.
Following the theme, we tested a binary search on the whole 4-digit number at once, but realised it would also take 14 questions (10000 to 5000 to 2500 to 1250 to 625 to 313 to 157 to 79 to 40 to 20 to 10 to 5 to 3 to 2 to 1). It was no more efficient than the two digit pairs, and was also harder to calculate.
I then tried a set of 3 digits plus a single digit, (1000 to 500 to 250 to 125 to 63 to 32 to 16 to 8 to 4 to 2 to 1) + 4 for the remaining digit, and again this was 14.
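All of these splits follow the same formula: a binary search over N possibilities needs ⌈log2 N⌉ questions, and splitting the digits into independent parts just sums that cost per part. A quick sketch (assuming a part of k digits ranges over all 10^k values):

```python
import math

def questions(partition):
    """Worst-case yes/no questions for binary searching each part,
    where a part of k digits has 10**k possible values."""
    return sum(math.ceil(math.log2(10 ** k)) for k in partition)

print(questions([1, 1, 1, 1]))  # four single digits  -> 16
print(questions([2, 2]))        # two digit pairs     -> 14
print(questions([4]))           # whole number        -> 14
print(questions([3, 1]))        # triple + single     -> 14
```

Since ⌈log2 10000⌉ = 14, no scheme that identifies the number by yes/no questions alone can beat 14 in the worst case, which is why every split bottoms out there.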
I then proposed a different solution - Could we potentially get more information by adding the digits together? I tried it on a number he had picked, asking questions to do a binary search on the sum of pairs of digits. Assigning the letters abcd to the four digits, I worked out a+b, c+d, b+c, and a+d. I figured doing this would allow me to arrange the numbers correctly once I had crunched it out.
Since each digit is at most 9, I knew the total of all four digits could reach at most 36. That meant each pair summed to at most 18, and the two pair sums together had to add up to no more than 36.
I started off by binary searching the sum of the first two digits, which would take a maximum of 5 turns (18 to 9 to 5 to 3 to 2 to 1), and repeated for the second two. That would take 10 turns, but give me the sum of all digits, the sum of the first two, and the sum of the second two.
At this point I was adamant that I could figure out the number using this information alone, but I was unsuccessful. I was able to narrow down the possible values for the outer pair (a+d) and inner pair (b+c): knowing the total of all four digits, I worked out which combinations a/b and c/d could be to satisfy it, and from those the potential sums the inner and outer pairs could take.
In our example test, a+b was 5, c+d was 14. From this I knew the total was 19. The first pair of numbers had to be 0 and 5, 1 and 4, or 2 and 3. The second pair had to be 9 and 5, 8 and 6, or 7 and 7. Using the logic of adding the highest number of one pair to the highest number of the other pair and then cycling through the values, I worked out that the outer two numbers had to add up to 14, 13, 12, or 11, and the inner two numbers had to add up to 8, 7, 6, or 5.
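The candidate pairs in this example can be enumerated mechanically. A small sketch (digits 0-9, pairs treated as unordered; the helper name is just illustrative) reproducing the sets above:

```python
def pairs(total):
    """Unordered digit pairs (x, y) with x + y == total, digits 0-9."""
    return [(x, total - x) for x in range(10) if x <= total - x <= 9]

print(pairs(5))   # [(0, 5), (1, 4), (2, 3)]
print(pairs(14))  # [(5, 9), (6, 8), (7, 7)]
```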
Binary searching these could be done in two searches each, bringing the total to... 14. :(
This is where it turned a bit weird though - After doing some logic on the resulting numbers (the outer pair was 12 and the inner pair was 7), I came up with three potential answers that satisfied every single constraint.
5077 4168 3259
These three numbers are amazing. The first two digits add up to 5, the third and fourth add up to 14, the first and fourth add up to 12, and the second and third add up to 7.
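That these are the only three candidates can be verified by brute force. A sketch (assuming the leading digit is nonzero) that checks every four-digit number against the four pairwise sums from the example:

```python
from itertools import product

# Every digit tuple (a, b, c, d) matching the four sums of the example:
# a+b = 5, c+d = 14, a+d = 12, b+c = 7.
solutions = [
    f"{a}{b}{c}{d}"
    for a, b, c, d in product(range(1, 10), range(10), range(10), range(10))
    if a + b == 5 and c + d == 14 and a + d == 12 and b + c == 7
]
print(solutions)  # ['3259', '4168', '5077']
```

Three solutions survive because the four sums are not independent: b+c is forced to equal (a+b) + (c+d) - (a+d), so only three constraints act on four digits, leaving a one-parameter family that the 0-9 digit bounds clip down to three members.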
Unfortunately, from here there was no choice but to guess all three; no amount of questioning could bring the worst case below three more turns. And so our final total was 17, no more efficient than just binary searching each digit in the first place.
And so, I ask you this - Is there a more efficient, human doable way to discover the four digit number than binary searching the first pair and second pair of digits?
I feel like there has to be, but I'm not knowledgeable enough to know!
submitted by Jademalo to mathematics

Subreddit Demographic Survey 2020: The Results

2020 Childfree Subreddit Survey

1. Introduction

Once a year, this subreddit hosts a survey in order to get to know the community a little and to answer questions that are frequently asked here. Earlier this summer, several thousand of you participated in the 2020 Subreddit Demographic Survey. Only the results of participants who meet our wiki definition of being childfree were recorded and analysed.
For these people, multiple areas of their lives were reviewed, separated into the sections below.

2. Methodology

Our sample is redditors who saw that we had a survey currently active and were willing to complete the survey. A stickied post was used to advertise the survey to members.

3. Results

The raw data may be found via this link.
7305 people participated in the survey from July 2020 to October 2020. People who did not meet our wiki definition of being childfree were excluded from the survey. The results of the remaining 5134 respondents, or 70.29% of those surveyed, were collated and analysed below. Percentages are derived from the respondents per question.

General Demographics

Age group

Age group Participants Percentage
18 or younger 309 6.02%
19 to 24 1388 27.05%
25 to 29 1435 27.96%
30 to 34 1089 21.22%
35 to 39 502 9.78%
40 to 44 223 4.35%
45 to 49 81 1.58%
50 to 54 58 1.13%
55 to 59 25 0.49%
60 to 64 13 0.25%
65 to 69 7 0.14%
70 to 74 2 0.04%
82.25% of the sub is under the age of 35.

Gender and Gender Identity

Gender Participants # Percentage
Agender 62 1.21%
Female 3747 73.04%
Male 1148 22.38%
Non-binary 173 3.37%

Sexual Orientation

Sexual Orientation Participants # Percentage
Asexual 379 7.39%
Bisexual 1177 22.93%
Heterosexual 2833 55.20%
Homosexual 264 5.14%
It's fluid 152 2.96%
Other 85 1.66%
Pansexual 242 4.72%

Birth Location

Because the list contains over 120 countries, we'll show the top 20 countries:
Country of birth Participants # Percentage
United States 2775 57.47%
United Kingdom 367 7.60%
Canada 346 7.17%
Australia 173 3.58%
Germany 105 2.17%
Netherlands 67 1.39%
India 63 1.30%
Poland 57 1.18%
France 47 0.97%
New Zealand 42 0.87%
Mexico 40 0.83%
Brazil 40 0.83%
Sweden 38 0.79%
Finland 31 0.64%
South Africa 30 0.62%
Denmark 28 0.58%
China 27 0.56%
Ireland 27 0.56%
Philippines 24 0.50%
Russia 23 0.48%
90.08% of the participants were born in these countries.
These participants would describe their current city, town or neighborhood as:
Region Participants # Percentage
Rural 705 13.76%
Suburban 2661 51.95%
Urban 1756 34.28%

Ethnicity

Ethnicity Participants # Percentage
African Descent/Black 157 3.07%
American Indian or Alaskan Native 18 0.35%
Arabic/Middle Eastern/Near Eastern 34 0.66%
Bi/Multiracial 300 5.86%
Caucasian/White 3946 77.09%
East Asian 105 2.05%
Hispanic/Latinx 271 5.29%
Indian/South Asian 116 2.27%
Indigenous Australian/Torres Strait Islander/Maori 8 0.16%
Jewish (the ethnicity, not religion) 50 0.98%
Other 32 0.63%
Pacific Islander/Melanesian 4 0.08%
South-East Asian 78 1.52%

Education

Highest Current Level of Education

Highest Current Level of Education Participants # Percentage
Associate's degree 233 4.55%
Bachelor's degree 1846 36.05%
Did not complete elementary school 2 0.04%
Did not complete high school 135 2.64%
Doctorate degree 121 2.36%
Graduated high school / GED 559 10.92%
Master's degree 714 13.95%
Post Doctorate 19 0.37%
Professional degree 107 2.09%
Some college / university 1170 22.85%
Trade / Technical / Vocational training 214 4.18%
Degree (Major) Participants # Percentage
Architecture 23 0.45%
Arts and Humanities 794 15.54%
Business and Economics 422 8.26%
Computer Science 498 9.75%
Education 166 3.25%
Engineering Technology 329 6.44%
I don't have a degree or a major 1028 20.12%
Law 124 2.43%
Life Sciences 295 5.77%
Medicine and Allied Health 352 6.89%
Other 450 8.81%
Physical Sciences 199 3.89%
Social Sciences 430 8.41%

Career and Finances

The top 10 industries our participants are working in are:
Industry Participants # Percentage
Information Technology 317 6.68%
Health Care 311 6.56%
Education - Teaching 209 4.41%
Engineering 203 4.28%
Retail 182 3.84%
Government 172 3.63%
Admin & Clerical 154 3.25%
Restaurant - Food Service 148 3.12%
Customer Service 129 2.72%
Design 127 2.68%
Note that "other", "I'm a student", "currently unemployed" and "I'm out of the work force for health or other reasons" have been disregarded for this part of the evaluation.
Out of the 3729 participants active in the workforce, the majority (1824 or 48.91%) work between 40-50 hours per week with 997 or 26.74% working 30-40 hours weekly. 6.62% work 50 hours or more per week, and 17.73% less than 30 hours.
513 or 10.13% are engaged in managerial responsibilities (ranging from Jr. to Sr. Management).
On a scale of 1 (lowest) to 10 (highest), the overwhelming majority (3340 or 70%) indicated that career plays a very important role in their lives, attributing a score of 7 or higher.
1065 participants decided not to disclose their income brackets. The remaining 4849 are distributed as follows:
Income Participants # Percentage
$0 to $14,999 851 21.37%
$15,000 to $29,999 644 16.17%
$30,000 to $59,999 1331 33.42%
$60,000 to $89,999 673 16.90%
$90,000 to $119,999 253 6.35%
$120,000 to $149,999 114 2.86%
$150,000 to $179,999 51 1.28%
$180,000 to $209,999 25 0.63%
$210,000 to $239,999 9 0.23%
$240,000 to $269,999 10 0.25%
$270,000 to $299,999 7 0.18%
$300,000 or more 15 0.38%
87.85% earn under $90,000 USD a year.
65.82% of our childfree participants do not have a concrete retirement plan (savings, living will).

Religion and Spirituality

Faith Originally Raised In

There were more than 50 options of faith, so we aimed to show the top 10 most chosen beliefs.
Faith Participants # Percentage
Catholicism 1573 30.76%
None (≠ Atheism. Literally, no notion of spirituality or religion in the upbringing) 958 18.73%
Protestantism 920 17.99%
Other 431 8.43%
Atheism 318 6.22%
Agnosticism 254 4.97%
Anglicanism 186 3.64%
Judaism 77 1.51%
Hinduism 75 1.47%
Islam 71 1.39%
This top 10 amounts to 95.01% of the total participants.

Current Faith

There were more than 50 options of faith, so we aimed to show the top 10 most chosen beliefs:
Faith Participants # Percentage
Atheism 1849 36.23%
None (≠ Atheism. Literally, no notion of spirituality or religion currently) 1344 26.33%
Agnosticism 789 15.46%
Other 204 4.00%
Protestantism 159 3.12%
Paganism 131 2.57%
Spiritualism 101 1.98%
Catholicism 96 1.88%
Satanism 92 1.80%
Wicca 66 1.29%
This top 10 amounts to 94.65% of the participants.

Level of Current Religious Practice

Level Participants # Percentage
Wholly secular/non-religious 3733 73.73%
Identify with religion, but don't practice strictly 557 11.00%
Lapsed/not serious/in name only 393 7.76%
Observant at home only 199 3.93%
Observant at home. Church/Temple/Mosque/etc. attendance 125 2.47%
Strictly observant, Church/Temple/Mosque/etc. attendance, religious practice/prayer/worship impacting daily life 56 1.11%

Effect of Faith over Childfreedom

Figure 1

Effect of Childfreedom over Faith

Figure 2

Romantic and Sexual Life

Current Dating Situation

Status Participants # Percentage
Divorced 46 0.90%
Engaged 207 4.04%
Long term relationship, living together 1031 20.10%
Long term relationship, not living together 512 9.98%
Married 1230 23.98%
Other 71 1.38%
Separated 18 0.35%
Short term relationship 107 2.09%
Single and dating around, but not looking for anything serious 213 4.15%
Single and dating around, looking for something serious 365 7.12%
Single and not looking 1324 25.81%
Widowed 5 0.10%

Childfree Partner

Is your partner childfree? If your partner wants children and/or has children of their own and/or is unsure about their position, please consider them "not childfree" for this question.
Partner Participants # Percentage
I don't have a partner 1922 37.56%
I have more than one partner and none are childfree 3 0.06%
I have more than one partner and some are childfree 35 0.68%
I have more than one partner and they are all childfree 50 0.98%
No 474 9.26%
Yes 2633 51.46%

Dating a Single Parent

Would the childfree participants be willing to date a single parent?
Answer Participants # Percentage
No, I'm not interested in single parents and their ties to parenting life 4610 90.13%
Yes, but only if it's a short term arrangement of some sort 162 3.17%
Yes, whether for long term or short term, but with some conditions (must not have child custody, no kid talk, etc.), as long as I like them and as long as we're compatible 199 3.89%
Yes, whether for long term or short term, with no conditions, as long as I like them and as long as we are compatible 144 2.82%

Childhood and Family Life

On a scale from 1 (very unhappy) to 10 (very happy), how would you rate your childhood?
Figure 3
Of the 5125 childfree people who responded to the question, 67.06% have a pet or are heavily involved in the care of someone else's pet.

Sterilisation

Sterilisation Status

Sterilisation Status Participants # Percentage
No, I am not sterilised and, for medical, practical or other reasons, I do not need to be 869 16.96%
No. However, I've been approved for the procedure and I'm waiting for the date to arrive 86 1.68%
No. I am not sterilised and don't want to be 634 12.37%
No. I want to be sterilised but I have started looking for a doctor/requested the procedure 594 11.59%
No. I want to be sterilised but I haven't started looking for a doctor/requested the procedure yet 2317 45.21%
Yes. I am sterilised 625 12.20%

Age when starting doctor shopping or addressing issue with doctor. Percentages exclude those who do not want to be sterilised and who have not discussed sterilisation with their doctor.

Age group Participants # Percentage
18 or younger 207 12.62%
19 to 24 588 35.85%
25 to 29 510 31.10%
30 to 34 242 14.76%
35 to 39 77 4.70%
40 to 44 9 0.55%
45 to 49 5 0.30%
50 to 54 1 0.06%
55 or older 1 0.06%

Age at the time of sterilisation. Percentages exclude those who have not and do not want to be sterilised.

Age group Participants # Percentage
18 or younger 5 0.79%
19 to 24 123 19.34%
25 to 29 241 37.89%
30 to 34 168 26.42%
35 to 39 74 11.64%
40 to 44 19 2.99%
45 to 49 1 0.16%
50 to 54 2 0.31%
55 or older 3 0.47%

Elapsed time between requesting procedure and undergoing procedure. Percentages exclude those who have not and do not want to be sterilised.

Time Participants # Percentage
Less than 3 months 330 50.46%
Between 3 and 6 months 111 16.97%
Between 6 and 9 months 33 5.05%
Between 9 and 12 months 20 3.06%
Between 12 and 18 months 22 3.36%
Between 18 and 24 months 15 2.29%
Between 24 and 30 months 6 0.92%
Between 30 and 36 months 2 0.31%
Between 3 and 5 years 40 6.12%
Between 5 and 7 years 25 3.82%
More than 7 years 50 7.65%

How many doctors refused at first, before finding one who would accept?

Doctor # Participants # Percentage
None. The first doctor I asked said yes 604 71.73%
One. The second doctor I asked said yes 93 11.05%
Two. The third doctor I asked said yes 54 6.41%
Three. The fourth doctor I asked said yes 29 3.44%
Four. The fifth doctor I asked said yes 12 1.43%
Five. The sixth doctor I asked said yes 8 0.95%
Six. The seventh doctor I asked said yes 10 1.19%
Seven. The eighth doctor I asked said yes 4 0.48%
Eight. The ninth doctor I asked said yes 2 0.24%
I asked more than 10 doctors before finding one who said yes 26 3.09%

Childfreedom

Primary Reason to Not Have Children

Reason Participants # Percentage
Aversion towards children ("I don't like children") 1455 28.36%
Childhood trauma 135 2.63%
Current state of the world 110 2.14%
Environmental (including overpopulation) 158 3.08%
Eugenics ("I have 'bad genes'") 57 1.11%
Financial 175 3.41%
I already raised somebody else who isn't my child 83 1.62%
Lack of interest towards parenthood ("I don't want to raise children") 2293 44.69%
Maybe interested in parenthood, but not suited for parenthood 48 0.94%
Medical ("I have a condition that makes conceiving/bearing/birthing children difficult, dangerous or lethal") 65 1.27%
Other 68 1.33%
Philosophical / Moral (e.g. antinatalism) 193 3.76%
Tokophobia (aversion/fear of pregnancy and/or childbirth) 291 5.67%
95.50% of childfree people are pro-choice; however, only 55.93% of childfree people support financial abortion.

Dislike Towards Children

Figure 4

Working With Children

Work Participants # Percentage
I'm a student and my future job/career will make me interact heavily with children on a daily basis 67 1.30%
I'm retired, but I used to have a job that made me interact heavily with children on a daily basis 6 0.12%
I'm unemployed, but I used to have a job that made me interact heavily with children on a daily basis 112 2.19%
No, I do not have a job that makes me heavily interact with children on a daily basis 4493 87.81%
Other 148 2.89%
Yes, I do have a job that makes me interact heavily with children on a daily basis 291 5.69%

4. Discussion

Child Status

This section solely existed to sift the childfree from the fencesitters and the non-childfree in order to get answers only from the childfree. Childfree, as it is defined in the subreddit, is "I do not have children nor want to have them in any capacity (biological, adopted, fostered, step- or other) at any point in the future." 70.29% of participants actually identify as childfree, slightly up from the 2019 survey, where 68.5% of participants identified as childfree. This is surprising in light of the overall reputation of the subreddit across reddit, where it is often described as an "echo chamber".

General Demographics

The demographics remain largely consistent with the 2019 survey. However, the 2019 survey collected demographic responses from all participants in the survey, removing those who did not identify as childfree when querying subreddit specific questions, while the 2020 survey only collected responses from people who identified as childfree. This must be considered when comparing results.
82.25% of the participants are under 35, compared with 85% of the subreddit in the 2019 survey. A slight downward trend is noted over the last two years, suggesting the userbase may be getting older on average. 73.04% of the subreddit identify as female, compared with 71.54% in the 2019 survey. Again, this suggests a slight increase in the number of members who identify as female. This is in contrast to the overall membership of Reddit, estimated at 74% male according to Reddit's Wikipedia page [https://en.wikipedia.org/wiki/Reddit#Users_and_moderators]. The ratio of members who identify as heterosexual remained consistent, from 54.89% in the 2019 survey to 55.20% in the 2020 survey.
Ethnicity-wise, 77% of members identified as primarily Caucasian, consistent with the 2019 results. While the ethnicities noted to be missing in the 2019 survey have been included in the 2020 survey, some users noted the difficulty of responding when they fit multiple ethnicities, and this will be addressed in the 2021 survey.

Education level

As it did in the 2019 survey, this section highlights the stereotype of childfree people as being well educated. 2.64% of participants did not complete high school, a slight decrease from the 2019 survey, where 4% of participants did not graduate high school. However, 6.02% of participants are under 18, compared with 8.22% in the 2019 survey. 55% of participants have a bachelor's degree or higher, while an additional 23% have completed "some college or university".
In the 2020 survey, the most common response to the question "What is your degree/major?" was "I don't have a degree or a major" (20.12%). Arts and Humanities, and Computer Science have overtaken Health Sciences and Engineering as the two most popular majors. However, the list of majors was pared down to general fields of study rather than highly specific degree majors to account for the significant diversity in majors studied by the childfree community, which may account for the different results.

Career and Finances

The highest percentage of participants (21.61%) listed themselves as trained professionals.
One of the stereotypes of the childfree is wealth. However, this is not demonstrated in the survey results. 70.95% of participants earn under $60,000 USD per annum, while 87.85% earn under $90,000 per annum. 21.37% earn under $15,000 per annum. 1065 participants, or 21.10%, chose not to disclose this information. It is possible that this may have skewed the results if a significant proportion of these people were high income earners, but this is impossible to explore.
A majority of our participants work between 30 and 50 hours per week (75.65%) which is slightly increased from the 2019 survey, where 71.2% of participants worked between 30 and 50 hours per week.

Location

The location responses are largely similar to the 2019 survey with a majority of participants living in a suburban and urban area. 86.24% of participants in the 2020 survey live in urban and suburban regions, with 86.7% of participants living in urban and suburban regions in the 2019 survey. There is likely a multifactorial reason for this, encompassing the younger, educated skew of participants and the easier access to universities and employment, and the fact that a majority of the population worldwide localises to urban centres. There may be an element of increased progressive social viewpoints and identities in urban regions, however this would need to be explored further from a sociological perspective to draw any definitive conclusions.
A majority of our participants (57.47%) were born in the USA. The United Kingdom (7.6%), Canada (7.17%), Australia (3.58%) and Germany (2.17%) encompass the next 4 most popular responses. This is largely consistent with the responses in the 2019 survey.

Religion and Spirituality

For the 2020 survey, Christianity (the most popular result in 2019) was split into its major denominations: Catholic, Protestant, Anglican, among others. This appears to be a linguistic/location difference that caused some confusion among participants. Catholicism at 30.76% remained the most popular choice for the religion participants were raised in. However, for our participants' current faith, Atheism at 36.23% was the most popular choice. A majority of 78.02% listed their current religion as Atheist, no religious or spiritual beliefs, or Agnostic.
A majority of participants (61%) rated religion as "not at all influential" to the childfree choice. This is consistent with the 2019 survey, where 62.8% rated religion as "not at all influential". Despite the high percentage of participants who identify as atheist or agnostic, this does not appear to be related to or have an impact on the childfree choice.

Romantic and Sexual Life

60.19% of our participants were in a relationship at the time of the survey. This is consistent with the 2019 survey, where 60.7% of our participants were in a relationship. A notable proportion of our participants are single and not looking (25.81%), which is consistent with the 2019 survey. Considering the frequent posts seeking dating advice as a childfree person, it is surprising that such a high proportion of the participants are not actively seeking out a relationship. Unsurprisingly, 90.13% of our participants would not consider dating someone with children. 84% of participants with partners of some kind have at least one childfree partner. This is consistent with the often irreconcilable conflict between one party desiring children and the other wishing to abstain from having them.

Childhood and Family Life

Overall, the participants skew towards a happier childhood.

Sterilisation

While just under half of our participants (45.21%) wish to be sterilised, only 12.2% have been successful in achieving sterilisation. This is likely due to overarching resistance from the medical profession; however, other factors such as the logistics of surgery and the cost may also contribute. There is a slight increase from the percentage of participants sterilised in the 2019 survey (11.7%). 29.33% of participants do not wish or need to be sterilised, suggesting a partial element of satisfaction with temporary birth control methods, or the non-necessity of contraception due to their current lifestyle practices. Participants who indicated that they do not wish to be sterilised or haven't achieved sterilisation were excluded from the percentages where necessary in this section.
Of the participants who did achieve sterilisation, a majority began the search between 19 and 29, with the highest proportion in the 19-24 age group (35.85%). This is a marked increase from the 2019 survey, where 27.3% of people who started the search were between 19 and 24. This may be due to increased education about permanent contraception, or possibly to an increase in instability around world events.
The majority of participants who sought out and achieved sterilisation were, however, in the 25-29 age group (37.9%). This is consistent with the 2019 survey results.
The time taken between seeking out sterilisation and achieving it continues to increase, with only 50.46% of participants achieving sterilisation in under 3 months. This is a decline from the 2019 survey, where 58.5% of participants achieved sterilisation within 3 months. A potential cause of this decrease is the Covid-19 shutdowns in the medical industry, leading to an increase in procedure wait times. The proportion of participants who have had one or more doctors refuse to perform the procedure has stayed consistent between the two surveys.

Childfreedom

The main reasons for choosing the childfree lifestyle are a lack of interest towards parenthood and an aversion towards children, which is consistent with the 2019 survey. Of the people surveyed, 67.06% are pet owners or involved in a pet's care, suggesting that this lack of interest in parenthood does not necessarily mean a lack of interest in all forms of caretaking. The community skews towards a dislike of children overall, which correlates well with the 87.81% of users choosing "no, I do not have, did not use to have and will not have a job that makes me heavily interact with children on a daily basis" in answer to "do you have a job that heavily makes you interact with children on a daily basis?". This is an increase from the 2019 survey.
A vast majority of the subreddit identifies as pro-choice (95.5%), a slight increase from the 2019 results. This is likely due to a high level of concern about bodily autonomy and forced birth/parenthood. However, only 55.93% support financial abortion, i.e. allowing the non-pregnant person in a relationship to sever all financial and parental ties with a child. This is a marked decrease from the 2019 results, where 70% of participants supported financial abortion.
Most of our users realised that they did not want children young. 58.72% of participants knew they did not want children by the age of 18, with 95.37% of users realising this by age 30. This correlates well with the age distribution of participants. Despite this early realisation of their childfree stance, 80.59% of participants have been "bingoed" at some stage in their lives.

The Subreddit

Participants who identify as childfree were asked about their interaction with and preferences with regards to the subreddit at large. Participants who do not meet our definition of being childfree were excluded from these questions.
By and large our participants were lurkers (72.32%). Our participants were divided on their favourite flairs with 38.92% selecting "I have no favourite". The next most favourite flair was "Rant", at 16.35%. Our participants were similarly divided on their least favourite flair, with 63.40% selecting "I have no least favourite". In light of these results the flairs on offer will remain as they have been through 2019.
With regards to "lecturing" posts, these are defined as posts which seek to re-educate the childfree on the practices, attitudes and values of the community, particularly with regards to attitudes towards parenting and children, whether at home or in the community. A commonly used descriptor is "tone policing". A small minority of the survey participants (3.36%) selected "yes" to allowing all lectures, while 33.54% responded "yes" to allowing polite, respectful lectures only. In addition, 45.10% of participants indicated that they were not sure if lectures should be allowed. Due to the ambiguity of responses, lectures will remain disallowed and will continue to be removed.
Many of our participants (36.87%) support the use of terms such as breeder, mombie/moo, daddict/duh on the subreddit, with a further 32.63% supporting use of these terms for bad parents only. This is a slight drop from the 2019 survey. In response, use of the above and similar terms to describe parents remains permitted on this subreddit. However, we encourage users to keep the use of these terms to bad parents only.
44.33% of users support the use of terms such as crotchfruit to describe children on the subreddit, a drop from 55.3% last year. A further 25.80% of users support the use of this and similar terms for bad children only, an increase from 17.42% last year. In response, use of the above and similar terms to describe children remains permitted on this subreddit.
69.17% of participants answered yes to allowing parents to post, provided they stay respectful. In response, parent posts will continue to be allowed on the subreddit. As for regret posts, which were to be revisited in this year's survey, only 9.5% of participants regarded them as their least favourite posts. As such, they will continue to be allowed.
64% of participants support childfree under-18s participating in the subreddit, with a further 19.59% allowing under-18s to post depending on context. Therefore we will continue to allow under-18s who stay within the overall Reddit age requirement.
There was a divide among participants as to whether "newbie" questions should be removed, with an even spread between those who selected remove and those who selected leave as is. We have therefore decided to leave them as is. 73.80% of users selected "yes, in their own post, with their own 'Leisure' flair" to the question "Should posts about pets, travel, jetskis, etc. be allowed on the sub?". Therefore we will continue to allow these posts provided they are appropriately flaired.

5. Conclusion

Thank you to our participants who contributed to the survey. This has been an unusual and difficult year for many people. Stay safe, and stay childfree.

submitted by Mellenoire to childfree

r/pharmacy 2020 demographics survey results!

The pharmacy 2020 demographics survey results are here! There were 258 respondents this year. Please note that the numbers will not necessarily add up to 100%, since all questions were optional. Sorry in advance for the crappy Excel graphs.
Location
Most respondents hailed from the US (233; 90.3%), followed by Canada (10; 3.9%), United Kingdom (8; 3.1%), New Zealand (2; 0.8%), and 1 respondent each from Australia, Indonesia, Slovakia, Sweden, and Taiwan.
Of the 233 Americans, the top 3 states were California (20; 8.6%), Pennsylvania (18; 7.7%), and Texas (18; 7.7%).
The 10 Canadians were from Ontario (5; 50%), British Columbia (2; 20%), Alberta (1; 10%), Nova Scotia (1; 10%), and Quebec (1; 10%).
Demographics
Of the 258 respondents, 130 (50.4%) identified as female, 123 (47.7%) as male, and 3 (1.2%) as non-binary.
Age distribution is shown in the table below. A few statistics: minimum 19, maximum 68, mean 29.0, median 28, mode 26.
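Summary statistics like these are straightforward to reproduce from the raw responses. A minimal sketch using Python's standard library — the `ages` list here is hypothetical sample data, not the actual survey responses:

```python
from statistics import mean, median, mode

# Hypothetical ages standing in for the real survey column.
ages = [19, 24, 26, 26, 26, 28, 28, 31, 35, 68]

stats = {
    "min": min(ages),
    "max": max(ages),
    "mean": round(mean(ages), 1),   # arithmetic mean, rounded to 1 dp
    "median": median(ages),         # midpoint of the sorted values
    "mode": mode(ages),             # most common value
}
print(stats)
```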
https://preview.redd.it/qxyxs2sj09c51.png?width=554&format=png&auto=webp&s=202bef88a53fa8596182435590ba9de8eb3646c9
In terms of race/ethnicity, the categories from most to least common were as follows: white (156; 60.5%), Asian (55; 21.3%), 2 or more races (11; 4.3%), black (9; 3.5%), Hispanic or Latino (8; 3.1%), Indian subcontinent (6; 2.3%), Arab (4; 1.6%), Native American or American Indian (2; 0.8%), and Armenian (1; 0.4%).
General employment questions
Of the 258 respondents, 169 (65.5%) were pharmacists, 55 (21.3%) were pharmacy students, 22 (8.5%) were non-pharmacist staff, and 8 (3.1%) were pre-pharmacy students. There were also 1 each of the following: corporate pharmacy compliance, pharmacy wholesaler, pharmacology student, and other healthcare professional.
Most respondents (169; 65.5%) were employed full time (defined as > 30 hours/week), while 19 (7.4%) were employed part time. 49 respondents (19.0%) were full time students (not necessarily in pharmacy), 13 (5.0%) were unemployed, 4 (1.6%) worked outside of the field of pharmacy, 2 (0.8%) were self-employed, 1 (0.4%) was retired, and 1 (0.4%) was consulting/contracting.
There was a nearly equal split between respondents working in suburban (99; 38.4%) vs. urban (97; 37.6%) locations, followed by 21 (8.1%) in rural locations and 15 (5.8%) working remotely (apologies - I should have made this question/response more clear, but based on a jump compared to last year's survey, I think people working from home temporarily due to COVID-19 may have chosen this option).
A pie chart of primary place of employment is shown below, with the top 7 responses shown in the legend: community/retail (136; 52.7%), hospital including outpatient (48; 18.6%), pharmaceutical industry including CROs (11; 4.3%), mail order/specialty/home infusion (9; 3.5%), unemployed (8; 3.1%), long-term care/hospice (8; 3.1%), and ambulatory care (5; 1.9%). Please note that the unemployed category includes non-working full time students.
https://preview.redd.it/csyipt0hs9c51.png?width=297&format=png&auto=webp&s=3b91337feb634a61730ccfbdd09aa8a0fdda6d7a
A small proportion (42; 16.3%) of respondents reported having a second job. Of these, the most common fields of employment were: hospital including outpatient (10; 23.8%), community/retail (8; 19.0%), and self employment/side hustle (7; 16.7%).
Salary
For the following charts, I only included those working full time. Below is a histogram for full time pharmacist salary worldwide, as well as a table showing some stats for global, US, and ex-US salaries.
https://preview.redd.it/n16j31x1v9c51.png?width=447&format=png&auto=webp&s=624581f5b94c917c417ac39da92cf9eb4c77130c
Global (139 responses) US (130 responses) Ex-US (9 responses)
Minimum $11,000 $11,000 $43,050
Maximum $300,000 $300,000 $230,000
Mean $116,284 $118,909 $78,375
Median $120,000 $120,961 $63,000
Below is the histogram for full time non-pharmacist staff worldwide. There was only 1 ex-US respondent, so I didn't separate out the stats. Here they are: minimum $15,000; maximum $72,000; mean $37,767; median $37,000.
https://preview.redd.it/q2w4f7t5y9c51.png?width=384&format=png&auto=webp&s=300c4992413830cb45befa7ffa9e24e9d5c2370d
Community/retail pharmacy
The pie chart shown below shows the breakdown of pharmacy type for the 136 respondents working in community/retail pharmacy. I'm not exactly sure what Genoa means, so I left that one as is.
https://preview.redd.it/begscv9fz9c51.png?width=288&format=png&auto=webp&s=c22e8ba0797ef1829bb9f0b30db9351b059a3264
Roles within community/retail pharmacy are displayed below.
https://preview.redd.it/l6l3w94zz9c51.png?width=265&format=png&auto=webp&s=ff10c40fd56bc3334762c06a5e6dc4e61a1004d8
The pie chart below displays responses regarding the impact of COVID-19 on hours/salary.
https://preview.redd.it/ugvcv06fbac51.png?width=276&format=png&auto=webp&s=60e055f753ed52c69220fb00e8ef817672804ebd
Hospital pharmacy (including outpatient)
There were 48 respondents working in hospital pharmacy. Bed count at their institutions is shown in the graph below.
https://preview.redd.it/1mv5r0ne1ac51.png?width=382&format=png&auto=webp&s=f46fa5df7be0c5b7d24603a218043fe4cb92f1bd
Roles within hospital pharmacy are displayed below.
https://preview.redd.it/4a3xewk72ac51.png?width=280&format=png&auto=webp&s=9d9a2dbd99882673300ad51e43808d90eb35d8a4
Of the 38 hospital pharmacists, 13 (34.2%) had completed a residency, and 5 (13.2%) were currently completing a residency. The remainder (20; 52.6%) were neither pursuing nor had ever completed a residency.
The top 3 clinical specialties were ambulatory care, emergency medicine, and oncology (3 respondents each). Note that it was possible to choose more than 1 specialty.
The pie chart below displays responses regarding the impact of COVID-19 on hours/salary.
https://preview.redd.it/b9pj5l3sbac51.png?width=278&format=png&auto=webp&s=b0478e7ea600140253b9dc53066210412967d4cd
Pharmaceutical industry (including CROs)
Eleven respondents (4.3%) reported working in the pharmaceutical industry. The breakdown by department is shown in the table below.
Department Number of Respondents
Drug Safety and Risk Management/Pharmacovigilance 2
Medical Communications/Education/Information 2
Regulatory Affairs 2
Clinical Pharmacology/Pharmacokinetics 1
Clinical Research & Development (including Clinical Operations) 1
Formulation 1
Marketing/Business Analytics 1
Medical Science Liaison 1
The breakdown by level was as follows: PharmD Fellow (3; 27.3%), Associate/Specialist (6; 54.5%), Manager/Supervisor (1; 9.1%), Director (1; 9.1%). Five respondents had completed or were currently completing a fellowship. Four of these 5 provided their salaries during their fellowships, with an average of $50,000.
Pharmacy and pre-pharmacy students
There were 63 respondents (24.4%) who reported being pharmacy or pre-pharmacy students. Of these, the top 3 desired fields upon graduation were: hospital including residencies (16; 25.4%), undecided (13; 20.6%), and community/retail (11; 17.5%).
These 63 students attended (or planned to attend) 45 different schools worldwide. The 5 most common schools reported were as follows: University of Toronto (3; 4.8%), Feik School of Pharmacy (2; 3.2%), Ohio State University (2; 3.2%), Temple University (2; 3.2%), and University of Colorado (2; 3.2%).
The breakdown by year was as follows: undergraduate/pre-pharmacy (8; 12.7%), PY1 (4; 6.3%), PY2 (18; 28.6%), PY3 (16; 25.4%), and PY4 (13; 20.6%). Of the 13 PY4 students, 2 reported having a job lined up after graduation, both in community/retail.
Most students (45; 71.4%) were working in a pharmacy setting while in school. Stats for the number of hours worked weekly were as follows: minimum 3; maximum 34; mean 15.8; median 15. The most common duties interns were authorized to perform at their jobs were counseling patients (38; 84.4%), administering immunizations (24; 53.3%), and product verification (17; 37.8%). Note that interns could choose more than 1 option.
Of the 63 students, 36 (57.1%) reported that they would choose to attend pharmacy school again if they could go back in time, knowing what they know now. Sixteen students (25.4%) reported that they would decide on a different career path, and 5 (7.9%) were unsure.
Following pharmacy school, some students were considering pursuing the following degrees (top 3 listed): MPH (6; 9.5%), MD (4; 6.3%), and MBA (3; 4.8%).
Results from additional questions are shown in chart form below.
https://preview.redd.it/mls7e2139ac51.png?width=480&format=png&auto=webp&s=5db3ec80fd6e1934c787941278b7b755ad802a45
https://preview.redd.it/p9p44ifm9ac51.png?width=480&format=png&auto=webp&s=faf04b54ed228cc0cf110d06ed27bfd524ba894f
https://preview.redd.it/8p7qq205aac51.png?width=464&format=png&auto=webp&s=ae5d53c284cd86ff787498dad58c4d625ae2afb1
Pharmacists
There were 169 pharmacists, from 91 different pharmacy schools. The most common alma maters were Rutgers University Ernest Mario School of Pharmacy (RU RAH RAH!!) with 6 respondents (3.6%), University of Pittsburgh with 5 respondents (3.0%), and the following 5 schools with 4 respondents each: Northeastern University, Ohio Northern University, University of Colorado, University of Georgia, and University of Kansas.
Most pharmacists (152; 89.9%) were currently practicing pharmacy. Five (3.0%) had practiced in the past but were no longer practicing, and 10 (5.9%) had never practiced after graduating. Of those currently practicing pharmacy, the statistics on the number of years in practice were as follows: minimum 0.1; maximum 35; mean 4.8; and median 3.
Nearly half of pharmacists (75; 49.3%) said they would choose a different career path if they could go back in time, knowing what they know now, while 71 pharmacists (46.7%) said they would still choose to pursue pharmacy.
Local practice standards
Just over half of pharmacists (84; 55.3%) reported administering (or being allowed to administer) many types of immunizations, while 3 (2.0%) reported that pharmacists were not allowed to immunize in their location. A further 63 pharmacists (41.4%) did not administer immunizations simply because it was not part of their job description (eg, hospital inpatient).
Regarding therapeutic interchange for non-controlled prescriptions, 63 pharmacists (41.4%) reported being authorized to update a prescription only after consulting the prescriber. An additional 43 pharmacists (28.3%) were allowed to update a prescription as long as the prescriber was notified afterwards (ie, without prior permission), and 8 pharmacists (5.3%) were allowed per institutional protocol or collaborative practice agreement. Twenty-four pharmacists (15.8%) reported that a new prescription would be required and that no updates by the pharmacist were allowed.
For controlled prescriptions, 24 pharmacists (15.8%) reported being allowed to change any/all elements of the prescription following consultation with the prescriber, and 4 pharmacists (2.6%) were allowed per institutional protocol or collaborative practice agreement. Sixty-six pharmacists (43.4%) were allowed to change certain (but not all) elements, while 40 (26.3%) could not change any part of a controlled prescription and required the prescriber to issue a new one.
Regarding pharmacist prescribing, most pharmacists (110; 72.4%) were not allowed to prescribe medications. Nineteen pharmacists (12.5%) could prescribe for certain health conditions, 3 (2.0%) could prescribe for any health condition, and 2 (1.3%) could prescribe per institutional protocol or collaborative practice agreement.
Results from additional questions are shown in chart form below.
https://preview.redd.it/9q4wjmmg3bc51.png?width=281&format=png&auto=webp&s=cf2ec43db13f3fcbe4cb398b1c39808389f54572
https://preview.redd.it/945u7beklac51.png?width=480&format=png&auto=webp&s=e74267ca8c2d56dd0c7fc42497df2f0d42f14a3a
https://preview.redd.it/yyd7su4tlac51.png?width=480&format=png&auto=webp&s=86e12e31c5de3b91a615add5dd28055f881beddc
https://preview.redd.it/tk2msh41mac51.png?width=480&format=png&auto=webp&s=c091747118370117d3ecf35a8e9bffd54ac02805
https://preview.redd.it/9njkd9vemac51.png?width=346&format=png&auto=webp&s=ffe54bfc9ae206295f7e81685a361357c14a625a
https://preview.redd.it/mywjx5nwmac51.png?width=444&format=png&auto=webp&s=1eb695e764c2bf7c1ffbfddd947fc297eed4f8ea
Pharmacy residents
Of the 169 pharmacists, 31 (18.3%) had completed or were currently completing a pharmacy residency. Of those, there were 6 current PGY-1 residents and 1 current PGY-2 resident.
Of the 24 pharmacists who had completed their PGY-1 residencies, most (18; 75%) did rotational programs without a specific focus. The remaining 6 pharmacists specialized in the following areas during their PGY-1: ambulatory care (2; 8.3%), community pharmacy (1; 4.2%), managed care (1; 4.2%), pediatrics (1; 4.2%), and pharmacotherapy (1; 4.2%). Stats on their PGY-1 salaries were as follows: minimum $33,000; maximum $60,000; mean $44,325; median $45,000. These PGY-1 residencies were done primarily in an urban setting (18; 75%), followed by suburban (3; 12.5%) and rural (2; 8.3%).
Of the 11 pharmacists who had completed their PGY-2 residencies, the specialties included: ambulatory care (3; 27.3%), psychiatry (2; 18.2%), and 1 each of administration, critical care, emergency medicine, infectious disease, oncology, and pharmacotherapy (9.1% each). Stats on their PGY-2 salaries were as follows: minimum $35,000; maximum $51,000; mean $45,625; median $46,500. These PGY-2 residencies were done almost equally in urban (6; 54.5%) and suburban (5; 45.5%) settings.
The 6 current PGY-1 residents had the following plans immediately following their PGY-1: inpatient staff pharmacist (2; 33.3%), PGY-2 residency (2; 33.3%), inpatient clinical specialty pharmacist (1; 16.7%), and non-practicing pharmacist (1; 16.7%).
Of those who had completed their residencies, their roles immediately afterward are listed in the table below.
Role Number of Respondents
Inpatient staff pharmacist 8
Inpatient clinical specialty pharmacist 6
Ambulatory care pharmacist 4
Unemployed 2
Outpatient pharmacist (eg, retail, mail order, long term care) 1
Stopped practicing but remained in the field of pharmacy (eg, industry) 1
Industry fellowship 1
Drug information pharmacist 1
Pharmacy organizations
This question was directed toward American respondents. There were 96 respondents who reported being currently active members of an association, the most common of which were ASHP (39; 40.6%), APhA (38; 39.6%), and a local/state pharmacy association (29; 30.2%).
There were 35 respondents who reported previously being members of an association, the most common of which were APhA (25; 71.4%), ASHP (15; 42.9%), and a local/state pharmacy association (13; 37.1%).
Final comments
Thanks again to everyone who took the survey, and especially those who provided feedback!
I totally acknowledge that the survey is very US-centric, and for that I apologize. I did take some feedback from some people in this subreddit, but if anyone ex-US wants to provide feedback for any future surveys, I'm happy to speak with you offline about it.
The same also goes for anyone in a "niche" field such as long-term care, ambulatory care, managed care, etc. I'm happy to add in new sections or questions for those fields - it's just that I have no idea what to ask, having no experience in those areas.
There are probably a few questions whose answers aren't reflected here mainly because this is long enough already, but if you have any questions (eg, what's the average salary for a hospital pharmacist in a suburban area?), please feel free to ask!
Thanks again!
submitted by fleakered to pharmacy [link] [comments]

crash help

DCS has been crashing for months; it used to never crash before :/
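When skimming a long dcs.log like the one below, pulling out just the ERROR and WARNING lines makes the likely culprit stand out. A minimal sketch — the log path in the comment is an example, not taken from this post:

```python
def find_problems(log_text: str) -> list[str]:
    """Return the ERROR and WARNING lines from a DCS log dump."""
    return [
        line for line in log_text.splitlines()
        if " ERROR " in line or " WARNING " in line
    ]

# Example usage (path is hypothetical -- point it at your own log,
# typically under Saved Games\DCS\Logs\dcs.log):
# from pathlib import Path
# text = Path(r"C:\Users\me\Saved Games\DCS\Logs\dcs.log").read_text(errors="ignore")
# for line in find_problems(text):
#     print(line)
```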

=== Log opened UTC 2020-10-11 14:18:20
2020-10-11 14:18:20.217 INFO DCS: Command line: D:\SteamLibrary\steamapps\common\DCSWorld\bin\DCS.exe
2020-10-11 14:18:20.217 INFO DCS: DCS/2.5.6.55960 (x86_64; Windows NT 10.0.18363)
2020-10-11 14:18:20.217 INFO DCS: DCS revision: 175957
2020-10-11 14:18:20.217 INFO DCS: Renderer revision: 20900
2020-10-11 14:18:20.217 INFO DCS: Terrain revision: 20771
2020-10-11 14:18:20.217 INFO DCS: CPU cores: 6, threads: 12, System RAM: 16333 MB, Pagefile: 9728 MB
2020-10-11 14:18:20.574 INFO EDCORE: (dDispatcher)enterToState_:0
2020-10-11 14:18:21.012 INFO Dispatcher: 2020/10/11 10:18 V1803061700
2020-10-11 14:18:21.081 INFO INPUT: Device [Keyboard] created deviceId = -1
2020-10-11 14:18:21.102 INFO INPUT: Device [Joystick - HOTAS Warthog {9BEE9130-944C-11ea-8002-444553540000}] created deviceId = -1
2020-10-11 14:18:21.102 INFO INPUT: Joystick created[Joystick - HOTAS Warthog {9BEE9130-944C-11ea-8002-444553540000}], ForceFeedBack: no
2020-10-11 14:18:21.151 INFO INPUT: Device [Throttle - HOTAS Warthog {7F0A2CA0-944C-11ea-8001-444553540000}] created deviceId = -1
2020-10-11 14:18:21.151 INFO INPUT: Joystick created[Throttle - HOTAS Warthog {7F0A2CA0-944C-11ea-8001-444553540000}], ForceFeedBack: no
2020-10-11 14:18:21.151 INFO INPUT: Device [Mouse] created deviceId = -1
2020-10-11 14:18:21.229 INFO SOUND: Using driver: wasapi
2020-10-11 14:18:21.232 INFO SOUND: Found 4 available audio device(s):
2020-10-11 14:18:21.232 INFO SOUND: 0: ID: "{0.0.0.00000000}.{a3bc94c3-9243-4999-94ca-cf14d01bfd63}", Name: "VG248 (3- NVIDIA High Definition Audio)"
2020-10-11 14:18:21.232 INFO SOUND: 1: ID: "{0.0.0.00000000}.{b9b8fde4-1f79-4ade-bbaa-bf1f2d844bc0}", Name: "Headphones (Oculus Virtual Audio Device)"
2020-10-11 14:18:21.232 INFO SOUND: 2: ID: "{0.0.0.00000000}.{e7804ff2-fc95-4bbc-9ba2-0e82dab295c7}", Name: "VE228 (3- NVIDIA High Definition Audio)"
2020-10-11 14:18:21.232 INFO SOUND: 3: ID: "{0.0.0.00000000}.{eefc5f93-4c6c-44cd-98cc-74f502409cb3}", Name: "Speakers (2- Yeti Stereo Microphone)"
2020-10-11 14:18:25.940 INFO SOUND: Added sound path: Sounds
2020-10-11 14:18:26.057 INFO SOUNDER: Loaded 207 packed sounders.
2020-10-11 14:18:26.057 INFO Dispatcher: InitLow
2020-10-11 14:18:28.003 INFO NET: Got Steam auth data.
2020-10-11 14:18:28.267 INFO DCS: Successfully got Steam authorization data.
2020-10-11 14:18:28.267 INFO Dispatcher: Loading installed modules...
2020-10-11 14:18:41.794 INFO Scripting: OBSOLETE mount_vfs_sound_path() used in ./CoreMods/aircraft/AJS37/entry.lua
2020-10-11 14:18:41.987 INFO Scripting: PLUGINS START-------------------------------------------------
2020-10-11 14:18:42.289 INFO SOUND: Added sound path: ./CoreMods/tech/TechWeaponPack/Sounds
2020-10-11 14:18:42.306 INFO SOUNDER: Loaded 3 packed sounders.
2020-10-11 14:18:42.734 INFO SOUND: Added sound path: ./CoreMods/tech/USS John C Stennis/Sounds
2020-10-11 14:18:42.735 INFO SOUNDER: Loaded 1 packed sounders.
2020-10-11 14:18:42.762 INFO SOUND: Added sound path: ./CoreMods/tech/USS_Nimitz/Sounds
2020-10-11 14:18:42.776 INFO SOUNDER: Loaded 3 packed sounders.
2020-10-11 14:18:43.542 INFO SOUND: Added sound path: ./CoreMods/aircraft/AJS37/Sounds
2020-10-11 14:18:43.661 INFO SOUND: Added sound path: ./CoreMods/aircraft/AV8BNA/Sounds
2020-10-11 14:18:43.669 INFO SOUNDER: Loaded 1 sounders.
2020-10-11 14:18:43.672 WARNING EDCORE: Source ./CoreMods/aircraft/AV8BNA/Shapes is already mounted to the same mount /models/.
2020-10-11 14:18:43.783 INFO SOUND: Added sound path: ./CoreMods/aircraft/C-101/Sounds
2020-10-11 14:18:44.163 INFO SOUND: Added sound path: ./CoreMods/aircraft/ChinaAssetPack/Sounds
2020-10-11 14:18:44.210 INFO SOUNDER: Loaded 11 sounders.
2020-10-11 14:18:44.487 INFO SOUND: Added sound path: ./CoreMods/aircraft/Christen Eagle II/Sounds
2020-10-11 14:18:46.253 INFO SOUND: Added sound path: ./CoreMods/aircraft/F14/Sounds
2020-10-11 14:18:46.302 INFO SOUNDER: Loaded 1 sounders.
2020-10-11 14:18:46.982 INFO SOUND: Added sound path: ./CoreMods/aircraft/Hawk/Sounds
2020-10-11 14:18:47.256 INFO SOUND: Added sound path: ./CoreMods/aircraft/I-16/Sounds
2020-10-11 14:18:47.502 INFO SOUND: Added sound path: ./CoreMods/aircraft/M-2000C/Sounds
2020-10-11 14:18:47.832 INFO SOUND: Added sound path: ./CoreMods/aircraft/MiG-21bis/Sounds
2020-10-11 14:18:47.992 INFO SOUND: Added sound path: ./CoreMods/aircraft/SA342/Sounds
2020-10-11 14:18:48.183 INFO SOUND: Added sound path: ./Mods/aircraft/F-15C/Sounds
2020-10-11 14:18:48.318 INFO SOUND: Added sound path: ./Mods/aircraft/F-16C/Sounds
2020-10-11 14:18:48.527 INFO SOUND: Added sound path: ./Mods/aircraft/FA-18C/Sounds
2020-10-11 14:18:48.667 INFO SOUND: Added sound path: ./Mods/aircraft/TF-51D/Sounds
2020-10-11 14:18:48.717 INFO SOUND: Added sound path: ./Mods/terrains/PersianGulf/Sounds
2020-10-11 14:18:48.782 INFO Scripting: PLUGINS DONE--------------------------------------------------
2020-10-11 14:18:49.315 INFO SECURITYCONTROL: Loaded ./CoreMods/aircraft/AV8BNA/bin/AV8B_AI.dll
2020-10-11 14:18:49.318 INFO SECURITYCONTROL: Loaded ./CoreMods/aircraft/C-101/bin/C101Core.dll
2020-10-11 14:18:49.322 INFO SECURITYCONTROL: Loaded ./CoreMods/aircraft/ChinaAssetPack/bin/CAP_AI.dll
2020-10-11 14:18:49.326 INFO SECURITYCONTROL: Loaded ./CoreMods/aircraft/ChinaAssetPack/bin/JF-17_AI.dll
2020-10-11 14:18:49.329 INFO CE2_AI: Loading CE AI
2020-10-11 14:18:49.329 INFO SECURITYCONTROL: Loaded ./CoreMods/aircraft/Christen Eagle II/bin/CE2_AI.dll
2020-10-11 14:18:49.333 INFO SECURITYCONTROL: Loaded ./CoreMods/aircraft/F14/bin/F14-AI.dll
2020-10-11 14:18:49.337 INFO MiG21_AI: Loading CE AI
2020-10-11 14:18:49.337 INFO SECURITYCONTROL: Loaded ./CoreMods/aircraft/MiG-21bis/bin/MiG21_AI.dll
2020-10-11 14:18:49.345 INFO SECURITYCONTROL: Loaded ./CoreMods/services/CaptoGlove/bin/edCaptoGlove.dll
2020-10-11 14:18:49.401 INFO SECURITYCONTROL: Loaded ./CoreMods/services/VoiceChat/bin/VoiceChat.dll
2020-10-11 14:18:49.802 INFO SECURITYCONTROL: Loaded ./Mods/aircraft/jsAvionics/bin/jsAvionics.dll
2020-10-11 14:18:52.059 INFO DCS: options.graphics =
{
['messagesFontScale'] = 1; ['rainDroplets'] = true; ['LensEffects'] = 0; ['heatBlr'] = 1; ['anisotropy'] = 3; ['water'] = 2; ['motionBlur'] = 0; ['outputGamma'] = 2.2; ['treesVisibility'] = 10000; ['aspect'] = 1.7777777777778; ['lights'] = 2; ['shadows'] = 4; ['MSAA'] = 1; ['SSAA'] = 0; ['civTraffic'] = 'high'; ['clutterMaxDistance'] = 1000; ['cockpitGI'] = 1; ['terrainTextures'] = 'max'; ['multiMonitorSetup'] = '1camera'; ['shadowTree'] = false; ['chimneySmokeDensity'] = 5; ['fullScreen'] = true; ['DOF'] = 0; ['clouds'] = 1; ['forestDistanceFactor'] = 1; ['flatTerrainShadows'] = 0; ['width'] = 1920; ['visibRange'] = 'High'; ['SSLR'] = 1; ['effects'] = 3; ['SSAO'] = 0; ['useDeferredShading'] = 1; ['sync'] = true; ['textures'] = 2; ['scaleGui'] = true; ['preloadRadius'] = 60000; ['height'] = 1080; ['terrainMapMFD'] = { ['distance'] = { ['mapLodDistance3'] = 200000; ['mapLodDistance2'] = 100000; ['mapLodDistance0'] = 25000; ['mapLodDistance1'] = 50000; }; };
};
2020-10-11 14:18:56.924 INFO GRAPHICSVISTA: renderer: 'dx11backend.dll'
2020-10-11 14:18:59.187 INFO DX11BACKEND: DX11Renderer initialization (w:1920 h:1080 fullscrn:1 vsync:0 adapter:0 monitor:1 shaderErrors:1)
2020-10-11 14:18:59.535 INFO DX11BACKEND: Driver Concurrent Creates - 1
2020-10-11 14:18:59.535 INFO DX11BACKEND: Driver Command Lists - 1
2020-10-11 14:18:59.654 INFO DX11BACKEND: NVIDIA API init OK
2020-10-11 14:18:59.660 INFO DX11BACKEND: NVIDIA Display Driver Version 45206.r452_02
2020-10-11 14:18:59.660 INFO DX11BACKEND: GPU count:1
2020-10-11 14:18:59.835 INFO DX11BACKEND: commonPool: 68-128, cbPool: 4-14, samplerPool: 5-16
2020-10-11 14:19:00.202 INFO DX11BACKEND: DX11ShaderBinaries::loadCache Bazashaders/fxo/fxo.zip
2020-10-11 14:19:05.330 ERROR EDCORE: Can't open file 'bazashaders/posteffects/slot.fx.' from fs
2020-10-11 14:19:05.330 WARNING DX11BACKEND: Shader posteffects/slot.fx:DIRECTX11=true;USE_DCS_DEFERRED=1; is outdated as file bazashaders/posteffects/slot.fx is modified.
2020-10-11 14:19:13.699 INFO DX11BACKEND: DX11ShaderBinaries::loadCache done. Loaded 1118/1119.
2020-10-11 14:19:13.753 INFO DX11BACKEND: DX11ShaderBinaries::loadCache C:\Users\ApBoy\Saved Games\DCS\fxo
2020-10-11 14:19:14.086 INFO DX11BACKEND: DX11ShaderBinaries::loadCache done. Loaded 62/62.
2020-10-11 14:19:14.509 INFO VISUALIZER: SceneManager initialization
2020-10-11 14:19:15.669 INFO VISUALIZER: cascade shadows init (preset:'default' quality:4 layers:4 size:4096)
2020-10-11 14:19:16.500 INFO SOUND: Opening default audio device.
2020-10-11 14:19:16.508 INFO SOUND: Driver reports 2 channels with mask 0x3
2020-10-11 14:19:16.508 INFO SOUND: Using 2 channels at 48000 Hz
2020-10-11 14:19:16.508 INFO SOUND: Channel layout: Headphones/Stereo
2020-10-11 14:19:16.859 INFO DCS: gDescription: "NVIDIA GeForce RTX 2070" gVendorId: 4318 gDeviceId: 7943 gMemory: 8031 MB
2020-10-11 14:19:16.983 ERROR EDOBJECTS: Destruction shape not found AVIASHTAB_CRASH
2020-10-11 14:19:16.985 ERROR EDOBJECTS: Object HB_F14_EXT_PHX_ALU with id=320 already declared in table MissileTable
2020-10-11 14:19:21.202 INFO EDCORE: (dDispatcher)enterToState_:1
2020-10-11 14:19:32.406 INFO UIBASERENDERER: Cannot load font [D:\SteamLibrary\steamapps\common\DCSWorld\dxgui\skins\fonts\]!
2020-10-11 14:19:38.362 INFO Dispatcher: //=== END OF INIT ======================================//
2020-10-11 14:19:38.362 INFO EDCORE: (dDispatcher)enterToState_:2
2020-10-11 14:19:38.463 INFO EDCORE: (dDispatcher)enterToState_:3
2020-10-11 14:19:38.463 INFO Lua: Lua CPU usage: metric: average game loading: 47.8552 %
2020-10-11 14:19:38.474 INFO DCS: Screen: MainMenu
2020-10-11 14:19:42.706 INFO NET: Login success.
2020-10-11 14:20:12.735 INFO SOUND: Using SSSE3 for peak calc.
2020-10-11 14:20:12.957 INFO DCS: Screen: Mission
2020-10-11 14:21:21.851 INFO DCS: Screen: MainMenu
2020-10-11 14:21:28.161 INFO DCS: Screen: FastMission
2020-10-11 14:21:37.690 INFO DCS: Screen: FastMissionAdvanced
2020-10-11 14:22:10.929 INFO DCS: Screen: MainMenu
2020-10-11 14:22:10.933 INFO DCS: Screen: FastMission
2020-10-11 14:22:29.263 INFO LUA-TERRAIN: Init('./Mods/terrains/Caucasus\terrain.cfg.lua')
2020-10-11 14:22:29.277 INFO TERRAIN: lSystem::load(./Mods/terrains/Caucasus\terrain.cfg.lua)
2020-10-11 14:22:29.277 INFO TERRAIN: lSystem::CleanScenes()
2020-10-11 14:22:29.277 INFO TERRAIN: lSystem::exit()
2020-10-11 14:22:29.277 INFO VISUALIZER: TerrainRenderer::release
2020-10-11 14:22:29.277 INFO TERRAIN: lSystem::CleanScenes()
2020-10-11 14:22:29.277 INFO EDOBJECTS: lTypeObjectManagerImpl::clear 13
2020-10-11 14:22:30.151 INFO EDTERRAIN4: TERRAIN_REVISION = 3634
2020-10-11 14:22:30.152 INFO EDTERRAIN4: Open terrain cfg "./Mods/terrains/Caucasus\terrain.cfg.lua"
2020-10-11 14:22:30.152 INFO EDTERRAIN4: Build date:
2020-10-11 14:22:30.152 INFO EDTERRAIN4: Texture quality: "max"
2020-10-11 14:22:31.277 INFO EDCORE: 1.072 s terrain references
2020-10-11 14:22:31.280 INFO EDCORE: 0.000 s terrain models
2020-10-11 14:22:31.284 INFO EDCORE: 0.000 s terrain assets
2020-10-11 14:22:31.519 INFO EDCORE: 0.232 s terrain assets2
2020-10-11 14:22:31.789 INFO EDCORE: 0.266 s terrain landfile
2020-10-11 14:22:31.807 INFO EDCORE: 0.014 s terrain surfaceTile
2020-10-11 14:22:31.812 INFO EDTERRAIN4: switchTextures (season=1, minTexture=false)
2020-10-11 14:22:33.298 INFO EDCORE: 1.487 s terrain vfstextures
2020-10-11 14:22:33.420 INFO EDCORE: 0.118 s terrain scene
2020-10-11 14:22:33.423 INFO EDCORE: 0.000 s terrain roaddetails
2020-10-11 14:22:33.427 INFO EDCORE: 0.000 s terrain forest
2020-10-11 14:22:40.255 INFO EDCORE: 6.824 s terrain routes
2020-10-11 14:22:40.260 INFO EDCORE: 0.000 s terrain surfacedetails
2020-10-11 14:22:40.263 INFO EDCORE: 0.000 s terrain blocks
2020-10-11 14:22:40.297 INFO EDCORE: 0.000 s terrain references
2020-10-11 14:22:40.588 INFO EDCORE: 0.322 s terrain superficials
2020-10-11 14:22:40.628 INFO EDCORE: 0.035 s terrain lma
2020-10-11 14:22:40.632 INFO EDCORE: 0.000 s terrain vectordatasettings
2020-10-11 14:22:42.496 INFO EDCORE: 1.860 s terrain navgraph
2020-10-11 14:22:42.544 INFO EDCORE: 0.043 s terrain vti
2020-10-11 14:22:43.670 INFO EDTERRAIN4: terrain time: 13.517271 s
2020-10-11 14:22:43.896 INFO TERRAIN: lSystem::InitScenes()
2020-10-11 14:22:43.896 INFO EDTERRAIN4: lTerraDispatchImpl::setDate( day=22, month=6)
2020-10-11 14:22:43.896 INFO EDTERRAIN4: switchTextures (season=1, minTexture=false)
2020-10-11 14:22:43.896 INFO DX11BACKEND: Reloading textures ...
2020-10-11 14:22:43.903 INFO LUA-TERRAIN: Init done
2020-10-11 14:22:44.032 INFO EDTERRAINGRAPHICS41: ITerrainGraphicsImpl4::openTerrain() START
2020-10-11 14:22:44.045 INFO EDTERRAINGRAPHICS41: loading terrain options: Config/terrain/terrainoptions41.lua
2020-10-11 14:22:44.059 INFO EDTERRAINGRAPHICS41: loading terrain shading options: ./Mods/terrains/Caucasus/shadingOptions/Caucasus.lua
2020-10-11 14:22:44.100 INFO METASHADER: MetaShaderCache2: [108] from ./Mods/terrains/Caucasus/misc/metacache/dcs.lua
2020-10-11 14:22:47.790 INFO METASHADER: loaded [108/108]
2020-10-11 14:22:47.890 WARNING EDTERRAINGRAPHICS41: failed to open "surfaceDetailsHints" section in terrain config
2020-10-11 14:22:47.995 INFO EDTERRAINGRAPHICS41: 0.000010s Loaded reference file "caucasus" buffers: 0
2020-10-11 14:22:47.995 INFO EDCORE: 0.000 s landscape4::lReferenceFile
2020-10-11 14:22:48.073 INFO EDTERRAINGRAPHICS41: 0.073605s Loaded reference file "airfieldslights" buffers: 7
2020-10-11 14:22:48.073 INFO EDCORE: 0.074 s landscape4::lReferenceFile
2020-10-11 14:22:48.655 INFO EDTERRAINGRAPHICS41: 0.578093s Loaded reference file "blockbuildings" buffers: 7
2020-10-11 14:22:48.655 INFO EDCORE: 0.578 s landscape4::lReferenceFile
2020-10-11 14:22:49.225 INFO EDTERRAINGRAPHICS41: 0.565400s Loaded reference file "blocktrees" buffers: 0
2020-10-11 14:22:49.225 INFO EDCORE: 0.565 s landscape4::lReferenceFile
2020-10-11 14:22:49.623 INFO EDTERRAINGRAPHICS41: 0.393164s Loaded reference file "caucasusafbbuildings_new" buffers: 27
2020-10-11 14:22:49.623 INFO EDCORE: 0.393 s landscape4::lReferenceFile
2020-10-11 14:22:49.639 INFO EDTERRAINGRAPHICS41: 0.012748s Loaded reference file "caucasusbridges" buffers: 11
2020-10-11 14:22:49.639 INFO EDCORE: 0.013 s landscape4::lReferenceFile
2020-10-11 14:22:49.697 INFO EDTERRAINGRAPHICS41: 0.053496s Loaded reference file "caucasusobjects" buffers: 0
2020-10-11 14:22:49.697 INFO EDCORE: 0.054 s landscape4::lReferenceFile
2020-10-11 14:22:49.711 INFO EDTERRAINGRAPHICS41: 0.008986s Loaded reference file "communication" buffers: 7
2020-10-11 14:22:49.711 INFO EDCORE: 0.009 s landscape4::lReferenceFile
2020-10-11 14:22:49.716 INFO EDTERRAINGRAPHICS41: 0.000351s Loaded reference file "crashmodels" buffers: 1
2020-10-11 14:22:49.716 INFO EDCORE: 0.000 s landscape4::lReferenceFile
2020-10-11 14:22:49.807 INFO EDTERRAINGRAPHICS41: 0.085697s Loaded reference file "crashmodelsfromedm" buffers: 22
2020-10-11 14:22:49.807 INFO EDCORE: 0.086 s landscape4::lReferenceFile
2020-10-11 14:22:49.837 INFO EDTERRAINGRAPHICS41: 0.025959s Loaded reference file "housedetails" buffers: 9
2020-10-11 14:22:49.837 INFO EDCORE: 0.026 s landscape4::lReferenceFile
2020-10-11 14:22:49.844 INFO EDTERRAINGRAPHICS41: 0.003396s Loaded reference file "industrial" buffers: 7
2020-10-11 14:22:49.844 INFO EDCORE: 0.003 s landscape4::lReferenceFile
2020-10-11 14:22:49.875 INFO EDTERRAINGRAPHICS41: 0.026320s Loaded reference file "misc" buffers: 3
2020-10-11 14:22:49.875 INFO EDCORE: 0.026 s landscape4::lReferenceFile
2020-10-11 14:22:49.886 INFO EDTERRAINGRAPHICS41: 0.006227s Loaded reference file "oilplatforms" buffers: 7
2020-10-11 14:22:49.886 INFO EDCORE: 0.006 s landscape4::lReferenceFile
2020-10-11 14:22:49.896 INFO EDTERRAINGRAPHICS41: 0.006233s Loaded reference file "tuapserefinery" buffers: 6
2020-10-11 14:22:49.896 INFO EDCORE: 0.006 s landscape4::lReferenceFile
2020-10-11 14:22:49.905 INFO EDTERRAINGRAPHICS41: 0.004649s Loaded reference file "vehicles" buffers: 9
2020-10-11 14:22:49.905 INFO EDCORE: 0.005 s landscape4::lReferenceFile
2020-10-11 14:22:49.914 INFO EDTERRAINGRAPHICS41: 0.004507s Loaded reference file "vessels" buffers: 7
2020-10-11 14:22:49.914 INFO EDCORE: 0.005 s landscape4::lReferenceFile
2020-10-11 14:22:49.926 INFO EDTERRAINGRAPHICS41: 0.008617s Loaded reference file "walls" buffers: 2
2020-10-11 14:22:49.926 INFO EDCORE: 0.009 s landscape4::lReferenceFile
2020-10-11 14:22:49.942 INFO EDTERRAINGRAPHICS41: 0.010335s Loaded reference file "americanbeech" buffers: 0
2020-10-11 14:22:49.942 INFO EDCORE: 0.010 s landscape4::lReferenceFile
2020-10-11 14:22:49.947 INFO EDTERRAINGRAPHICS41: 0.001066s Loaded reference file "caucasus_fir" buffers: 0
2020-10-11 14:22:49.947 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.952 INFO EDTERRAINGRAPHICS41: 0.000813s Loaded reference file "cypress_oak" buffers: 0
2020-10-11 14:22:49.952 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.957 INFO EDTERRAINGRAPHICS41: 0.000834s Loaded reference file "european_beech" buffers: 0
2020-10-11 14:22:49.957 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.961 INFO EDTERRAINGRAPHICS41: 0.000630s Loaded reference file "green_ash" buffers: 0
2020-10-11 14:22:49.961 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.966 INFO EDTERRAINGRAPHICS41: 0.000919s Loaded reference file "honey_mesquite" buffers: 0
2020-10-11 14:22:49.966 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.972 INFO EDTERRAINGRAPHICS41: 0.001254s Loaded reference file "italiancypress" buffers: 0
2020-10-11 14:22:49.972 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.979 INFO EDTERRAINGRAPHICS41: 0.001002s Loaded reference file "lombardypoplar" buffers: 0
2020-10-11 14:22:49.979 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.984 INFO EDTERRAINGRAPHICS41: 0.001451s Loaded reference file "mountain_maple" buffers: 0
2020-10-11 14:22:49.984 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.991 INFO EDTERRAINGRAPHICS41: 0.001008s Loaded reference file "norwayspruce" buffers: 0
2020-10-11 14:22:49.991 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.998 INFO EDTERRAINGRAPHICS41: 0.000759s Loaded reference file "shrub" buffers: 0
2020-10-11 14:22:49.998 INFO EDCORE: 0.001 s landscape4::lReferenceFile
2020-10-11 14:22:49.998 INFO EDTERRAINGRAPHICS41: InstanceManager2 is completing initialization...
2020-10-11 14:22:50.002 INFO EDTERRAINGRAPHICS41: InstanceManager2::updateObjectDeclBuffer()
2020-10-11 14:22:50.002 INFO EDTERRAINGRAPHICS41: InstanceManager2::updateObjectLodDeclBuffer()
2020-10-11 14:22:50.004 INFO EDTERRAINGRAPHICS41: InstanceManager2 complete initialization:
2020-10-11 14:22:50.004 INFO EDTERRAINGRAPHICS41: objects: 666
2020-10-11 14:22:50.004 INFO EDTERRAINGRAPHICS41: subObjects: 493
2020-10-11 14:22:50.004 INFO EDTERRAINGRAPHICS41: objectLods: 2911
2020-10-11 14:22:50.004 INFO EDTERRAINGRAPHICS41: objectLodRenderItems: 330
2020-10-11 14:22:50.004 INFO EDTERRAINGRAPHICS41: MAX_LODS_IN_OBJECT: 10
2020-10-11 14:22:50.004 INFO EDTERRAINGRAPHICS41: MAX_SUBOBJECTS_IN_OBJECT: 74
2020-10-11 14:22:50.004 INFO EDTERRAINGRAPHICS41: GEOMETRY BUFFERS: 132
2020-10-11 14:22:50.016 INFO EDCORE: 0.000 s landscape5::OceanFile
2020-10-11 14:22:51.136 INFO EDCORE: 1.115 s landscape5::GrassFile
2020-10-11 14:22:51.141 INFO EDCORE: 0.000 s landscape5::lModels5File
2020-10-11 14:22:51.401 WARNING LOG: 16 duplicate message(s) skipped.
2020-10-11 14:22:51.401 INFO EDTERRAINGRAPHICS41: remapper.deformers is not empty
2020-10-11 14:22:51.751 INFO EDTERRAINGRAPHICS41: materialParams[29084]*1536
2020-10-11 14:22:51.754 INFO EDCORE: 0.507 s landscape5::Surface5File
2020-10-11 14:22:51.784 INFO EDTERRAINGRAPHICS41: materialParams[810]*1536
2020-10-11 14:22:51.784 INFO EDCORE: 0.026 s landscape5::SurfaceTile
2020-10-11 14:22:51.790 INFO EDCORE: 0.001 s landscape5::Scene5File
2020-10-11 14:22:51.829 INFO EDCORE: 0.036 s landscape4::lRoutesFile
2020-10-11 14:22:51.867 INFO EDCORE: 0.034 s landscape4::lSurfaceDetails2File
2020-10-11 14:22:51.884 INFO EDTERRAINGRAPHICS41: materialParams[1]*1536
2020-10-11 14:22:51.884 INFO EDCORE: 0.013 s landscape4::lSuperficialFile
2020-10-11 14:22:51.908 INFO EDCORE: 0.020 s landscape4::lGeoNamesFile
2020-10-11 14:22:51.913 INFO EDCORE: 0.000 s landscape5::sup5File
2020-10-11 14:22:51.921 WARNING LOG: 1 duplicate message(s) skipped.
2020-10-11 14:22:51.921 INFO EDCORE: 0.000 s landscape5::navGraph5File
2020-10-11 14:22:51.921 INFO EDTERRAINGRAPHICS41: ITerrainGraphicsImpl4::openTerrain() END 7.889645 s
2020-10-11 14:22:51.989 ERROR_ONCE DX11BACKEND: render target 'mainDepthBuffer_copy' not found
2020-10-11 14:22:51.989 ERROR_ONCE DX11BACKEND: render target 'DummyShadowMap' not found
2020-10-11 14:23:26.265 ERROR EDCORE: can't open MO-file D:\SteamLibrary\steamapps\common\DCSWorld\l10n\cn\LC_MESSAGES\dcs.mo
2020-10-11 14:23:26.266 ERROR EDCORE: can't open MO-file D:\SteamLibrary\steamapps\common\DCSWorld\l10n\cs\LC_MESSAGES\dcs.mo
2020-10-11 14:23:26.266 ERROR EDCORE: can't open MO-file D:\SteamLibrary\steamapps\common\DCSWorld\l10n\de\LC_MESSAGES\dcs.mo
2020-10-11 14:23:26.267 ERROR EDCORE: can't open MO-file D:\SteamLibrary\steamapps\common\DCSWorld\l10n\es\LC_MESSAGES\dcs.mo
2020-10-11 14:23:26.268 ERROR EDCORE: can't open MO-file D:\SteamLibrary\steamapps\common\DCSWorld\l10n\fr\LC_MESSAGES\dcs.mo
2020-10-11 14:23:26.309 ERROR EDCORE: can't open MO-file D:\SteamLibrary\steamapps\common\DCSWorld\l10n\jp\LC_MESSAGES\dcs.mo
2020-10-11 14:23:26.345 ERROR EDCORE: can't open MO-file D:\SteamLibrary\steamapps\common\DCSWorld\l10n\pl\LC_MESSAGES\dcs.mo
2020-10-11 14:23:26.346 ERROR EDCORE: can't open MO-file D:\SteamLibrary\steamapps\common\DCSWorld\l10n\ru\LC_MESSAGES\dcs.mo
2020-10-11 14:23:41.607 INFO Dispatcher: loadMission C:\Users\ApBoy\AppData\Local\Temp\DCS\tempMission.miz
2020-10-11 14:23:41.607 INFO WORLDGENERAL: loading mission from: "C:\Users\ApBoy\AppData\Local\Temp\DCS\tempMission.miz"
2020-10-11 14:23:41.695 INFO EDCORE: (dDispatcher)enterToState_:4
2020-10-11 14:23:41.695 INFO Dispatcher: Terrain theatre Caucasus
2020-10-11 14:23:41.734 INFO Dispatcher: Start
2020-10-11 14:23:41.912 INFO Dispatcher: Terrain theatre Caucasus
2020-10-11 14:23:41.914 INFO TERRAIN: lSystem::load(./Mods/terrains/Caucasus\terrain.cfg.lua)
2020-10-11 14:23:41.914 INFO TERRAIN: lSystem::CleanScenes()
2020-10-11 14:23:41.914 INFO TERRAIN: lSystem::InitScenes()
2020-10-11 14:23:41.914 INFO EDTERRAIN4: lTerraDispatchImpl::setDate( day=18, month=11)
2020-10-11 14:23:41.914 INFO EDTERRAIN4: switch to season = autumn
2020-10-11 14:23:41.914 INFO EDTERRAIN4: switchTextures (season=2, minTexture=false)
2020-10-11 14:23:42.526 INFO DX11BACKEND: Reloading textures ...
2020-10-11 14:23:47.486 WARNING GRAPHICSCORE: already registered Renderer callback
2020-10-11 14:23:47.492 INFO VISUALIZER: StartSimulation
2020-10-11 14:23:47.993 INFO VISUALIZER: cascade shadows init (preset:'default' quality:4 layers:4 size:4096)
2020-10-11 14:23:48.869 ERROR_ONCE DX11BACKEND: texture 'lightning.dds' not found. Asked from 'EFFECTS2'
2020-10-11 14:23:50.385 ERROR_ONCE DX11BACKEND: texture 'LiquidAnimationAlphas' not found. Asked from 'EFFECTS2'
2020-10-11 14:23:50.385 ERROR_ONCE DX11BACKEND: texture 'LiquidAnimationNormals' not found. Asked from 'EFFECTS2'
2020-10-11 14:23:51.331 INFO Dispatcher: initial random seed = 6230979
2020-10-11 14:23:51.331 INFO Dispatcher: apply random seed = 6230979
2020-10-11 14:23:53.818 INFO WORLDGENERAL: loaded from mission Scripts/World/GPS_GNSS.lua
2020-10-11 14:23:55.957 INFO WORLDGENERAL: loaded from mission Config/View/SnapViewsDefault.lua
2020-10-11 14:23:55.965 INFO WORLDGENERAL: loaded from mission Config/View/View.lua
2020-10-11 14:23:55.975 INFO WORLDGENERAL: loaded from mission Config/View/Server.lua
2020-10-11 14:23:56.577 INFO Config: netview started
2020-10-11 14:23:57.705 INFO VISUALIZER: CameraIndependedPreload()
2020-10-11 14:23:57.705 INFO VISUALIZER: CameraIndependedPreload() finished
2020-10-11 14:23:57.705 INFO DCS: use_xRay: no
2020-10-11 14:24:17.625 INFO DCS: MissionSpawn:initScript
2020-10-11 14:24:17.625 INFO DCS: MissionSpawn:spawnCoalition red
2020-10-11 14:24:17.625 INFO DCS: MissionSpawn:spawnPlanes 18
2020-10-11 14:24:19.322 INFO DCS: MissionSpawn:spawnVehicles 24
2020-10-11 14:24:20.877 INFO GRAPHICSVISTA: Creating Resource "Unicode" of type 1
2020-10-11 14:24:21.800 INFO DCS: MissionSpawn:spawnVehicles 27
2020-10-11 14:24:21.803 INFO DCS: MissionSpawn:spawnVehicles 37
2020-10-11 14:24:22.260 INFO DCS: MissionSpawn:spawnPlanes 38
2020-10-11 14:24:22.318 INFO DCS: MissionSpawn:spawnHelicopters 0
2020-10-11 14:24:23.478 INFO DCS: MissionSpawn:spawnVehicles 0
2020-10-11 14:24:23.864 INFO DCS: MissionSpawn:spawnVehicles 43
2020-10-11 14:24:24.491 INFO DCS: MissionSpawn:spawnVehicles 19
2020-10-11 14:24:24.975 INFO DCS: MissionSpawn:spawnVehicles 47
2020-10-11 14:24:25.054 INFO DCS: MissionSpawn:spawnCoalition blue
2020-10-11 14:24:25.054 INFO DCS: MissionSpawn:spawnVehicles 16
2020-10-11 14:24:25.768 INFO DCS: MissionSpawn:spawnVehicles 15
2020-10-11 14:24:26.161 INFO DCS: MissionSpawn:spawnPlanes 9
2020-10-11 14:24:27.207 INFO DCS: MissionSpawn:spawnPlanes 2
2020-10-11 14:24:27.259 INFO DCS: MissionSpawn:spawnHelicopters 2
2020-10-11 14:24:29.025 INFO DCS: MissionSpawn:spawnVehicles 2
2020-10-11 14:24:31.391 INFO DCS: MissionSpawn:spawnLocalPlayer 62,F-16C_50
2020-10-11 14:24:31.973 INFO SECURITYCONTROL: Loaded ./Mods/aircraft/F-16C/bin/F16C.dll
2020-10-11 14:24:38.276 WARNING EDCORE: Source ./Mods/aircraft/F-16C/Cockpit/Scripts/../Shape is already mounted to the same mount /models/.
2020-10-11 14:24:38.757 INFO VISUALIZER: cockpit ILV loaded
2020-10-11 14:24:39.133 ERROR_ONCE DX11BACKEND: Can't load 'Fonts/font_general_RU.dds' in texture array ''.
2020-10-11 14:24:40.122 ERROR COCKPITBASE: devices_keeper::link_all: unable to find link target 'SMS' for device 'MAV_INTERFACE'
2020-10-11 14:24:41.652 ERROR COCKPITBASE: controller "SMS_AG_Harm_TER_TBL" missed in class cockpit::F16::ccMFD_F16
2020-10-11 14:24:44.257 WARNING LOG: 1 duplicate message(s) skipped.
2020-10-11 14:24:44.257 INFO COCKPITBASE: lua state still active DED, 26 (status undefined)
2020-10-11 14:24:44.257 INFO COCKPITBASE: lua state still active EHSI, 28 (status undefined)
2020-10-11 14:24:44.258 INFO COCKPITBASE: lua state still active UHF_RADIO, 36
2020-10-11 14:24:44.258 INFO COCKPITBASE: lua state still active VHF_RADIO, 38
2020-10-11 14:24:44.258 INFO COCKPITBASE: lua state still active INTERCOM, 39 (status undefined)
2020-10-11 14:24:44.259 INFO COCKPITBASE: lua state still active MACROS, 52 (status undefined)
2020-10-11 14:24:44.259 INFO COCKPITBASE: lua state still active TGP_INTERFACE, 58 (status undefined)
2020-10-11 14:24:44.666 ERROR SOUND: source_add(host:COCKPIT_MAIN, proto:Aircrafts/F-16/Cockpits/GearLockDown_In, alt_proto:Aircrafts//Cockpits/GearLockDown_In): can't find proto
2020-10-11 14:24:44.697 WARNING LOG: 1 duplicate message(s) skipped.
2020-10-11 14:24:44.697 INFO WORLDGENERAL: loaded from mission Scripts/World/birds.lua
2020-10-11 14:24:44.697 INFO DCS: dbox not found , skip
2020-10-11 14:24:46.709 INFO EDTERRAINGRAPHICS41: surface5 gc() LOD 0 0 squares
2020-10-11 14:24:46.709 INFO EDTERRAINGRAPHICS41: surface5 gc() LOD 1 0 squares
2020-10-11 14:24:46.709 INFO EDTERRAINGRAPHICS41: surface5 gc() 0.035400 ms
2020-10-11 14:24:46.813 INFO VISUALIZER: Preload() camera=15734.308756, 2023.594708, 233822.516476 radius=60000.000000
2020-10-11 14:24:46.813 INFO EDTERRAINGRAPHICS41: ITerrainGraphicsImpl4::forceLoading(): pos=(15734.3, 2023.59, 233823), radius=60000
2020-10-11 14:24:49.285 INFO EDCORE: try to write dump information
2020-10-11 14:24:49.287 INFO EDCORE: # -------------- 20201011-142449 --------------
2020-10-11 14:24:49.287 INFO EDCORE: DCS/2.5.6.55960 (x86_64; Windows NT 10.0.18363)
2020-10-11 14:24:49.289 INFO EDCORE: C:\WINDOWS\SYSTEM32\VCRUNTIME140.dll
2020-10-11 14:24:49.289 INFO EDCORE: # C0000005 ACCESS_VIOLATION at 10841637 00:00000000
2020-10-11 14:24:49.292 INFO EDCORE: SymInit: Symbol-SearchPath: '.', symOptions: 534, UserName: 'ApBoy'
2020-10-11 14:24:49.294 INFO EDCORE: OS-Version: 10.0.18363 () 0x300-0x1
2020-10-11 14:24:49.688 INFO EDCORE: 0x0000000000001637 (VCRUNTIME140): memcpy + 0x347
2020-10-11 14:24:49.688 INFO EDCORE: 0x000000000000988B (renderer): RenderAPI::openShader + 0x48B
2020-10-11 14:24:49.690 INFO EDCORE: 0x0000000000009280 (renderer): RenderAPI::openFromBlobShader + 0x180
2020-10-11 14:24:49.691 INFO EDCORE: 0x0000000000067D31 (metaShader): BaseBinder::render + 0x11431
2020-10-11 14:24:49.691 INFO EDCORE: 0x00000000000676AB (metaShader): BaseBinder::render + 0x10DAB
2020-10-11 14:24:49.691 INFO EDCORE: 0x000000000006BCCE (metaShader): BaseBinder::render + 0x153CE
2020-10-11 14:24:49.691 INFO EDCORE: 0x000000000000C624 (metaShader): render::MetaShaderManager::loadMetaShader + 0x174
2020-10-11 14:24:49.691 INFO EDCORE: 0x0000000000054101 (metaShader): RenderAPI::loadMetaShader + 0x41
2020-10-11 14:24:49.692 INFO EDCORE: 0x0000000000006604 (metaShader): render::MetaShader::open + 0x34
2020-10-11 14:24:49.692 INFO EDCORE: 0x00000000000990C8 (edterrainGraphics41): createInstancerRenderable + 0x6598
2020-10-11 14:24:49.693 INFO EDCORE: 0x000000000009B275 (edterrainGraphics41): createInstancerRenderable + 0x8745
2020-10-11 14:24:49.693 INFO EDCORE: 0x00000000000D9827 (edterrainGraphics41): CreateITerrainGraphics + 0x2227
2020-10-11 14:24:49.693 INFO EDCORE: 0x00000000000DAC19 (edterrainGraphics41): CreateITerrainGraphics + 0x3619
2020-10-11 14:24:49.693 INFO EDCORE: 0x00000000000F3B2D (edterrainGraphics41): edtg41::lMaterialParamsArrayGraphics::clear + 0xD5D
2020-10-11 14:24:49.693 INFO EDCORE: 0x0000000000063AE8 (edterrainGraphics41): edtg41::TerrainRenderable::dumpRenderItem + 0x531D8
2020-10-11 14:24:49.693 INFO EDCORE: 0x00000000000642EF (edterrainGraphics41): edtg41::TerrainRenderable::dumpRenderItem + 0x539DF
2020-10-11 14:24:49.694 INFO EDCORE: 0x0000000000063DF9 (edterrainGraphics41): edtg41::TerrainRenderable::dumpRenderItem + 0x534E9
2020-10-11 14:24:49.694 INFO EDCORE: 0x000000000006256E (edterrainGraphics41): edtg41::TerrainRenderable::dumpRenderItem + 0x51C5E
2020-10-11 14:24:49.694 INFO EDCORE: 0x0000000000060913 (edterrainGraphics41): edtg41::TerrainRenderable::dumpRenderItem + 0x50003
2020-10-11 14:24:49.694 INFO EDCORE: 0x0000000000060941 (edterrainGraphics41): edtg41::TerrainRenderable::dumpRenderItem + 0x50031
2020-10-11 14:24:49.695 WARNING LOG: 4 duplicate message(s) skipped.
2020-10-11 14:24:49.695 INFO EDCORE: 0x0000000000066220 (edterrainGraphics41): edtg41::TerrainRenderable::dumpRenderItem + 0x55910
2020-10-11 14:24:49.695 INFO EDCORE: 0x00000000000658FC (edterrainGraphics41): edtg41::TerrainRenderable::dumpRenderItem + 0x54FEC
2020-10-11 14:24:49.696 INFO EDCORE: 0x00000000000D0C15 (edterrainGraphics41): createInstancerRenderable + 0x3E0E5
2020-10-11 14:24:49.696 INFO EDCORE: 0x000000000011F30E (Visualizer): TerrainRenderer::forceLoading + 0x7E
2020-10-11 14:24:49.715 INFO EDCORE: 0x000000000014427E (Visualizer): smSceneManager::CreateSceneManager + 0x372E
2020-10-11 14:24:49.715 INFO EDCORE: 0x00000000007267A7 (DCS): CoreUtils::TempFilesManager::operator= + 0x3B8B77
2020-10-11 14:24:49.719 INFO EDCORE: 0x00000000007114D0 (DCS): CoreUtils::TempFilesManager::operator= + 0x3A38A0
2020-10-11 14:24:49.719 INFO EDCORE: 0x000000000011C27A (edCore): Common::FSM::sendOutputSymbol_ + 0x4A
2020-10-11 14:24:49.719 INFO EDCORE: 0x000000000011B912 (edCore): Common::FSM::enterToState_ + 0xC2
2020-10-11 14:24:49.719 INFO EDCORE: 0x000000000011C0F9 (edCore): Common::FSM::onSymbol_ + 0x1A9
2020-10-11 14:24:49.720 INFO EDCORE: 0x00000000007243A5 (DCS): CoreUtils::TempFilesManager::operator= + 0x3B6775
2020-10-11 14:24:49.720 INFO EDCORE: 0x00000000006F6604 (DCS): CoreUtils::TempFilesManager::operator= + 0x3889D4
2020-10-11 14:24:49.720 INFO EDCORE: 0x00000000006F69C9 (DCS): CoreUtils::TempFilesManager::operator= + 0x388D99
2020-10-11 14:24:49.720 INFO EDCORE: 0x00000000016E8CEB (DCS): AmdPowerXpressRequestHighPerformance + 0xB1ACE7
2020-10-11 14:24:49.722 INFO EDCORE: 0x00000000008E261E (DCS): CoreUtils::TempFilesManager::operator= + 0x5749EE
2020-10-11 14:24:49.724 INFO EDCORE: 0x0000000000017BD4 (KERNEL32): BaseThreadInitThunk + 0x14
2020-10-11 14:24:49.724 INFO EDCORE: 0x000000000006CE51 (ntdll): RtlUserThreadStart + 0x21
2020-10-11 14:24:52.510 INFO EDCORE: Minidump created.
2020-10-11 14:24:52.520 INFO DCS: enumerating loaded modules
=== Log closed.
submitted by buddy1225 to dcs

TensorFlow 2.0 CNN validation ValueError

I have a Conv-6 CNN, inspired by VGG-19, for the CIFAR-10 dataset, which I am training with data augmentation via Keras's ImageDataGenerator.flow() method. The code is as follows:


    # NOTE: imports implied by the code below (not shown in the original post)-
    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense
    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    num_classes = 10   # CIFAR-10

    # Data preprocessing and cleaning:

    # input image dimensions
    img_rows, img_cols = 32, 32

    # Load CIFAR-10 dataset-
    (X_train, y_train), (X_test, y_test) = tf.keras.datasets.cifar10.load_data()
    print("X_train.shape = {0}, y_train.shape = {1}".format(X_train.shape, y_train.shape))
    print("X_test.shape = {0}, y_test.shape = {1}".format(X_test.shape, y_test.shape))
    # X_train.shape = (50000, 32, 32, 3), y_train.shape = (50000, 1)
    # X_test.shape = (10000, 32, 32, 3), y_test.shape = (10000, 1)

    if tf.keras.backend.image_data_format() == 'channels_first':
        X_train = X_train.reshape(X_train.shape[0], 3, img_rows, img_cols)
        X_test = X_test.reshape(X_test.shape[0], 3, img_rows, img_cols)
        input_shape = (3, img_rows, img_cols)
    else:
        X_train = X_train.reshape(X_train.shape[0], img_rows, img_cols, 3)
        X_test = X_test.reshape(X_test.shape[0], img_rows, img_cols, 3)
        input_shape = (img_rows, img_cols, 3)

    print("\n'input_shape' which will be used = {0}\n".format(input_shape))
    # 'input_shape' which will be used = (32, 32, 3)

    # Convert datasets to floating point types-
    X_train = X_train.astype('float32')
    X_test = X_test.astype('float32')

    # Normalize the training and testing datasets-
    X_train /= 255.0
    X_test /= 255.0

    # Convert class vectors/targets to binary class matrices (one-hot encoded values)-
    y_train = tf.keras.utils.to_categorical(y_train, num_classes)
    y_test = tf.keras.utils.to_categorical(y_test, num_classes)

    print("\nDimensions of training and testing sets are:")
    print("X_train.shape = {0}, y_train.shape = {1}".format(X_train.shape, y_train.shape))
    print("X_test.shape = {0}, y_test.shape = {1}".format(X_test.shape, y_test.shape))
    # Dimensions of training and testing sets are:
    # X_train.shape = (50000, 32, 32, 3), y_train.shape = (50000, 10)
    # X_test.shape = (10000, 32, 32, 3), y_test.shape = (10000, 10)

    train_dataset_features = tf.data.Dataset.from_tensor_slices(X_train)
    train_dataset_labels = tf.data.Dataset.from_tensor_slices(y_train)
    test_dataset_features = tf.data.Dataset.from_tensor_slices(X_test)
    test_dataset_labels = tf.data.Dataset.from_tensor_slices(y_test)

    # Choose an optimizer and loss function for training-
    loss_fn = tf.keras.losses.CategoricalCrossentropy()
    optimizer = tf.keras.optimizers.Adam(lr=0.0003)

    # Select metrics to measure the error & accuracy of model.
    # These metrics accumulate the values over epochs and then
    # print the overall result-
    train_loss = tf.keras.metrics.Mean(name='train_loss')
    train_accuracy = tf.keras.metrics.CategoricalAccuracy(name='train_accuracy')
    test_loss = tf.keras.metrics.Mean(name='test_loss')
    test_accuracy = tf.keras.metrics.CategoricalAccuracy(name='test_accuracy')

    # Example of using tf.keras.preprocessing.image.ImageDataGenerator's flow(x, y)-
    datagen = ImageDataGenerator(
        # featurewise_center=True,
        # featurewise_std_normalization=True,
        rotation_range=90,
        width_shift_range=0.1,
        height_shift_range=0.1,
        horizontal_flip=True
    )

    def conv6_cnn():
        """
        Function to define the architecture of a neural network model
        following the Conv-6 architecture for the CIFAR-10 dataset, using
        provided parameters which are used to prune the model.

        Conv-6 architecture-
        64, 64, pool    -- convolutional layers
        128, 128, pool  -- convolutional layers
        256, 256, pool  -- convolutional layers
        256, 256, 10    -- fully connected layers

        Output: Returns designed and compiled neural network model
        """
        model = Sequential()
        model.add(Conv2D(filters=64, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same', input_shape=(32, 32, 3)))
        model.add(Conv2D(filters=64, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
        model.add(Conv2D(filters=128, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(Conv2D(filters=128, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
        model.add(Conv2D(filters=256, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(Conv2D(filters=256, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
        model.add(Flatten())
        model.add(Dense(units=256, activation='relu',
                        kernel_initializer=tf.initializers.GlorotNormal()))
        model.add(Dense(units=256, activation='relu',
                        kernel_initializer=tf.initializers.GlorotNormal()))
        model.add(Dense(units=10, activation='softmax'))

        # Compile pruned CNN-
        model.compile(
            loss=tf.keras.losses.categorical_crossentropy,
            # optimizer='adam',
            optimizer=tf.keras.optimizers.Adam(lr=0.0003),
            metrics=['accuracy']
        )

        return model

    # Instantiate a new Conv-6 CNN model-
    orig_model = conv6_cnn()

    # Load weights from before, having 92.55% sparsity-
    orig_model.load_weights("Conv_6_CIFAR10_Magnitude_Based_Winning_Ticket_Distribution_92.55423622890814.h5")

    # Create mask using winning ticket-
    # Use masks to preserve sparsity-
    # Instantiate a new neural network model for which the mask is to be created-
    mask_model = conv6_cnn()

    # Load weights of PRUNED model-
    mask_model.set_weights(orig_model.get_weights())

    # For each layer, for each weight which is 0, leave it as is;
    # for weights which survive the pruning, reinitialize them to ONE (1)-
    for wts in mask_model.trainable_weights:
        wts.assign(tf.where(tf.equal(wts, 0.), 0., 1.))

    # User input parameters for Early Stopping in manual implementation-
    minimum_delta = 0.001
    patience = 3
    best_val_loss = 100
    loc_patience = 0

    # Initialize a new Conv-6 model for the winning ticket-
    winning_ticket_model = conv6_cnn()

    # Load weights of winning ticket-
    winning_ticket_model.set_weights(orig_model.get_weights())

    # Define 'train_one_step()' and 'test_step()' functions here-
    @tf.function
    def train_one_step(model, mask_model, optimizer, x, y):
        '''
        Function to compute one step of gradient descent optimization
        '''
        with tf.GradientTape() as tape:
            # Make predictions using defined model-
            y_pred = model(x)

            # Compute loss-
            loss = loss_fn(y, y_pred)

        # Compute gradients wrt defined loss and weights and biases-
        grads = tape.gradient(loss, model.trainable_variables)
        # type(grads) # list

        # List to hold element-wise multiplication between
        # computed gradients and masks-
        grad_mask_mul = []

        # Perform element-wise multiplication between computed gradients and masks-
        for grad_layer, mask in zip(grads, mask_model.trainable_weights):
            grad_mask_mul.append(tf.math.multiply(grad_layer, mask))

        # Apply computed gradients to model's weights and biases-
        optimizer.apply_gradients(zip(grad_mask_mul, model.trainable_variables))

        # Compute accuracy-
        train_loss(loss)
        train_accuracy(y, y_pred)

        return None

    @tf.function
    def test_step(model, optimizer, data, labels):
        """
        Function to test model performance on testing dataset
        """
        predictions = model(data)
        t_loss = loss_fn(labels, predictions)

        test_loss(t_loss)
        test_accuracy(labels, predictions)

        return None

    batch_size = 32   # NOTE: not defined in the original post; assumed value
    curr_step = 0

    for x, y in datagen.flow(X_train, y_train, batch_size=batch_size, shuffle=True):
        train_one_step(winning_ticket_model, mask_model, optimizer, x, y)
        # print("current step = ", curr_step)
        curr_step += 1
        if curr_step >= X_train.shape[0] // batch_size:
            print("\nTerminating training (datagen.flow())")
            break
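The masking trick in `train_one_step()` above (multiply each gradient by a 0/1 mask so that pruned, i.e. zero, weights receive zero gradient and stay zero across updates) can be sketched in plain numpy. The arrays below are made-up toy values, not the actual model weights:

```python
import numpy as np

# Toy weight vector with some pruned (zero) entries
weights = np.array([0.5, 0.0, -0.3, 0.0], dtype=np.float32)

# 0/1 mask: 1 where the weight survived pruning, 0 where it was pruned,
# mirroring tf.where(tf.equal(wts, 0.), 0., 1.)
mask = (weights != 0.0).astype(np.float32)

# Pretend these gradients came from backprop
grads = np.array([0.1, 0.2, 0.3, 0.4], dtype=np.float32)

# Element-wise multiply, as in the zip(grads, mask_model.trainable_weights) loop
masked_grads = grads * mask

# One SGD-style update: pruned positions remain exactly zero
weights -= 0.1 * masked_grads
print(weights)  # pruned entries (indices 1 and 3) are still 0.0
```

Because the mask zeroes the gradient rather than the weight, the optimizer never moves a pruned weight away from zero, which is what preserves the sparsity pattern during retraining.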
But the following code gives an error:

    for x_t, y_t in test_dataset:
        test_step(winning_ticket_model, optimizer, x_t, y_t)

> ValueError Traceback (most recent call
> last) in
> 1 for x_t, y_t in test_dataset:
> ----> 2 test_step(winning_ticket_model, optimizer, x_t, y_t)
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py
> in __call__(self, *args, **kwds)
> 578 xla_context.Exit()
> 579 else:
> 580 result = self._call(*args, **kwds)
> 581
> 582 if tracing_count == self._get_tracing_count():
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py
> in _call(self, *args, **kwds)
> 625 # This is the first call of __call__, so we have to initialize.
> 626 initializers = []
> 627 self._initialize(args, kwds, add_initializers_to=initializers)
> 628 finally:
> 629 # At this point we know that the initialization is complete (or less
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py
> in _initialize(self, args, kwds, add_initializers_to)
> 504 self._concrete_stateful_fn = (
> 505 self._stateful_fn._get_concrete_function_internal_garbage_collected(
> # pylint: disable=protected-access
> 506 *args, **kwds))
> 507
> 508 def invalid_creator_scope(*unused_args, **unused_kwds):
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/function.py
> in _get_concrete_function_internal_garbage_collected(self, *args,
> **kwargs) 2444 args, kwargs = None, None 2445 with self._lock:
> -> 2446 graph_function, _, _ = self._maybe_define_function(args, kwargs) 2447 return graph_function 2448
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/function.py
> in _maybe_define_function(self, args, kwargs) 2775 2776
> self._function_cache.missed.add(call_context_key)
> -> 2777 graph_function = self._create_graph_function(args, kwargs) 2778 self._function_cache.primary[cache_key] =
> graph_function 2779 return graph_function, args, kwargs
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/function.py
> in _create_graph_function(self, args, kwargs,
> override_flat_arg_shapes) 2665 arg_names=arg_names,
> 2666 override_flat_arg_shapes=override_flat_arg_shapes,
> -> 2667 capture_by_value=self._capture_by_value), 2668 self._function_attributes, 2669 # Tell the ConcreteFunction
> to clean up its graph once it goes out of
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py
> in func_graph_from_py_func(name, python_func, args, kwargs, signature,
> func_graph, autograph, autograph_options, add_control_dependencies,
> arg_names, op_return_value, collections, capture_by_value,
> override_flat_arg_shapes)
> 979 _, original_func = tf_decorator.unwrap(python_func)
> 980
> 981 func_outputs = python_func(*func_args, **func_kwargs)
> 982
> 983 # invariant: `func_outputs` contains only Tensors, CompositeTensors,
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eagedef_function.py
> in wrapped_fn(*args, **kwds)
> 439 # __wrapped__ allows AutoGraph to swap in a converted function. We give
> 440 # the function a weak reference to itself to avoid a reference cycle.
> 441 return weak_wrapped_fn().__wrapped__(*args, **kwds)
> 442 weak_wrapped_fn = weakref.ref(wrapped_fn)
> 443
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py
> in wrapper(*args, **kwargs)
> 966 except Exception as e: # pylint:disable=broad-except
> 967 if hasattr(e, "ag_error_metadata"):
> 968 raise e.ag_error_metadata.to_exception(e)
> 969 else:
> 970 raise
>
> ValueError: in user code:
>
> :45 test_step *
> predictions = model(data)
> /home/majumda.local/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py:886
> __call__ **
> self.name)
> /home/majumda.local/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py:180
> assert_input_compatibility
> str(x.shape.as_list()))
>
> ValueError: Input 0 of layer sequential_7 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [32, 32, 3]
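For reference, the shape in the last line of the traceback, [32, 32, 3], is a single image rather than a batch of images. A minimal numpy sketch of the difference (hypothetical stand-in array, not the real X_test):

```python
import numpy as np

# Hypothetical stand-in for X_test: 10 CIFAR-10-sized images
X = np.zeros((10, 32, 32, 3), dtype=np.float32)

single = X[0]    # one element: shape (32, 32, 3)    -> ndim=3
batch = X[0:4]   # a batch:     shape (4, 32, 32, 3) -> ndim=4

print(single.ndim, batch.ndim)  # 3 4
```

The Conv2D input layer expects the 4-dimensional (batched) form, which is why it rejects the 3-dimensional element.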


What's the problem?

Thanks!
submitted by grid_world to neuralnetworks

But the following code gives error:

 for x_t, y_t in test_dataset: test_step(winning_ticket_model, optimizer, x_t, y_t) 

> ValueError Traceback (most recent call
> last) in
> 1 for x_t, y_t in test_dataset:
> -- 2 test_step(winning_ticket_model, optimizer, x_t, y_t)
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eagedef_function.py
> in __call__(self, *args, **kwds)
> 578 xla_context.Exit()
> 579 else:
> 580 result = self._call(*args, **kwds)
> 581
> 582 if tracing_count == self._get_tracing_count():
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eagedef_function.py
> in _call(self, *args, **kwds)
> 625 # This is the first call of __call__, so we have to initialize.
> 626 initializers = []
> 627 self._initialize(args, kwds, add_initializers_to=initializers)
> 628 finally:
> 629 # At this point we know that the initialization is complete (or less
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eagedef_function.py
> in _initialize(self, args, kwds, add_initializers_to)
> 504 self._concrete_stateful_fn = (
> 505 self._stateful_fn._get_concrete_function_internal_garbage_collected(
> # pylint: disable=protected-access
> 506 *args, **kwds))
> 507
> 508 def invalid_creator_scope(*unused_args, **unused_kwds):
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eagefunction.py
> in _get_concrete_function_internal_garbage_collected(self, *args,
> **kwargs) 2444 args, kwargs = None, None 2445 with self._lock:
> -> 2446 graph_function, _, _ = self._maybe_define_function(args, kwargs) 2447 return graph_function 2448
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eagefunction.py
> in _maybe_define_function(self, args, kwargs) 2775 2776
> self._function_cache.missed.add(call_context_key)
> -> 2777 graph_function = self._create_graph_function(args, kwargs) 2778 self._function_cache.primary[cache_key] =
> graph_function 2779 return graph_function, args, kwargs
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eagefunction.py
> in _create_graph_function(self, args, kwargs,
> override_flat_arg_shapes) 2665 arg_names=arg_names,
> 2666 override_flat_arg_shapes=override_flat_arg_shapes,
> -> 2667 capture_by_value=self._capture_by_value), 2668 self._function_attributes, 2669 # Tell the ConcreteFunction
> to clean up its graph once it goes out of
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py
> in func_graph_from_py_func(name, python_func, args, kwargs, signature,
> func_graph, autograph, autograph_options, add_control_dependencies,
> arg_names, op_return_value, collections, capture_by_value,
> override_flat_arg_shapes)
> 979 _, original_func = tf_decorator.unwrap(python_func)
> 980
> 981 func_outputs = python_func(*func_args, **func_kwargs)
> 982
> 983 # invariant: `func_outputs` contains only Tensors, CompositeTensors,
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eagedef_function.py
> in wrapped_fn(*args, **kwds)
> 439 # __wrapped__ allows AutoGraph to swap in a converted function. We give
> 440 # the function a weak reference to itself to avoid a reference cycle.
> 441 return weak_wrapped_fn().__wrapped__(*args, **kwds)
> 442 weak_wrapped_fn = weakref.ref(wrapped_fn)
> 443
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py
> in wrapper(*args, **kwargs)
> 966 except Exception as e: # pylint:disable=broad-except
> 967 if hasattr(e, "ag_error_metadata"):
> 968 raise e.ag_error_metadata.to_exception(e)
> 969 else:
> 970 raise
>
> ValueError: in user code:
>
> :45 test_step *
> predictions = model(data)
> /home/majumda.local/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py:886
> __call__ **
> self.name)
> /home/majumda.local/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py:180
> assert_input_compatibility
> str(x.shape.as_list()))
>
> ValueError: Input 0 of layer sequential_7 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [32, 32, 3]
>
> ​


What's the problem?

Thanks!
submitted by grid_world to tensorflow [link] [comments]

TensorFlow 2.0 CNN validation ValueError

I have a Conv-6 CNN, inspired by VGG-19, for the CIFAR-10 dataset, which I am training with data augmentation using the ImageDataGenerator flow() method. The code is as follows:


    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense
    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    # Data preprocessing and cleaning:

    # Input image dimensions-
    img_rows, img_cols = 32, 32
    num_classes = 10

    # Load CIFAR-10 dataset-
    (X_train, y_train), (X_test, y_test) = tf.keras.datasets.cifar10.load_data()
    print("X_train.shape = {0}, y_train.shape = {1}".format(X_train.shape, y_train.shape))
    print("X_test.shape = {0}, y_test.shape = {1}".format(X_test.shape, y_test.shape))
    # X_train.shape = (50000, 32, 32, 3), y_train.shape = (50000, 1)
    # X_test.shape = (10000, 32, 32, 3), y_test.shape = (10000, 1)

    if tf.keras.backend.image_data_format() == 'channels_first':
        X_train = X_train.reshape(X_train.shape[0], 3, img_rows, img_cols)
        X_test = X_test.reshape(X_test.shape[0], 3, img_rows, img_cols)
        input_shape = (3, img_rows, img_cols)
    else:
        X_train = X_train.reshape(X_train.shape[0], img_rows, img_cols, 3)
        X_test = X_test.reshape(X_test.shape[0], img_rows, img_cols, 3)
        input_shape = (img_rows, img_cols, 3)

    print("\n'input_shape' which will be used = {0}\n".format(input_shape))
    # 'input_shape' which will be used = (32, 32, 3)

    # Convert datasets to floating point types and normalize-
    X_train = X_train.astype('float32') / 255.0
    X_test = X_test.astype('float32') / 255.0

    # Convert class vectors/targets to one-hot encoded values-
    y_train = tf.keras.utils.to_categorical(y_train, num_classes)
    y_test = tf.keras.utils.to_categorical(y_test, num_classes)

    print("\nDimensions of training and testing sets are:")
    print("X_train.shape = {0}, y_train.shape = {1}".format(X_train.shape, y_train.shape))
    print("X_test.shape = {0}, y_test.shape = {1}".format(X_test.shape, y_test.shape))
    # X_train.shape = (50000, 32, 32, 3), y_train.shape = (50000, 10)
    # X_test.shape = (10000, 32, 32, 3), y_test.shape = (10000, 10)

    train_dataset_features = tf.data.Dataset.from_tensor_slices(X_train)
    train_dataset_labels = tf.data.Dataset.from_tensor_slices(y_train)
    test_dataset_features = tf.data.Dataset.from_tensor_slices(X_test)
    test_dataset_labels = tf.data.Dataset.from_tensor_slices(y_test)

    # Choose an optimizer and loss function for training-
    loss_fn = tf.keras.losses.CategoricalCrossentropy()
    optimizer = tf.keras.optimizers.Adam(lr=0.0003)

    # Select metrics to measure the error & accuracy of the model.
    # These metrics accumulate the values over epochs and then
    # print the overall result-
    train_loss = tf.keras.metrics.Mean(name='train_loss')
    train_accuracy = tf.keras.metrics.CategoricalAccuracy(name='train_accuracy')
    test_loss = tf.keras.metrics.Mean(name='test_loss')
    test_accuracy = tf.keras.metrics.CategoricalAccuracy(name='test_accuracy')

    # Data augmentation using ImageDataGenerator's flow(x, y)-
    datagen = ImageDataGenerator(
        # featurewise_center=True,
        # featurewise_std_normalization=True,
        rotation_range=90,
        width_shift_range=0.1,
        height_shift_range=0.1,
        horizontal_flip=True
    )

    def conv6_cnn():
        """
        Define a neural network following the Conv-6 architecture
        for the CIFAR-10 dataset.

        Conv-6 architecture:
        64, 64, pool    -- convolutional layers
        128, 128, pool  -- convolutional layers
        256, 256, pool  -- convolutional layers
        256, 256, 10    -- fully connected layers

        Output: returns the designed and compiled neural network model.
        """
        model = Sequential()
        model.add(Conv2D(filters=64, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same', input_shape=(32, 32, 3)))
        model.add(Conv2D(filters=64, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
        model.add(Conv2D(filters=128, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(Conv2D(filters=128, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
        model.add(Conv2D(filters=256, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(Conv2D(filters=256, kernel_size=(3, 3), activation='relu',
                         kernel_initializer=tf.initializers.GlorotNormal(),
                         strides=(1, 1), padding='same'))
        model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
        model.add(Flatten())
        model.add(Dense(units=256, activation='relu',
                        kernel_initializer=tf.initializers.GlorotNormal()))
        model.add(Dense(units=256, activation='relu',
                        kernel_initializer=tf.initializers.GlorotNormal()))
        model.add(Dense(units=10, activation='softmax'))

        # Compile pruned CNN-
        model.compile(
            loss=tf.keras.losses.categorical_crossentropy,
            optimizer=tf.keras.optimizers.Adam(lr=0.0003),
            metrics=['accuracy']
        )

        return model

    # Instantiate a new Conv-6 CNN model-
    orig_model = conv6_cnn()

    # Load weights from before, having 92.55% sparsity-
    orig_model.load_weights("Conv_6_CIFAR10_Magnitude_Based_Winning_Ticket_Distribution_92.55423622890814.h5")

    # Create mask using the winning ticket; the mask is used to preserve sparsity.
    # Instantiate a new neural network model for which the mask is to be created-
    mask_model = conv6_cnn()

    # Load weights of the PRUNED model-
    mask_model.set_weights(orig_model.get_weights())

    # For each layer, leave each weight which is 0 as-is, and
    # reinitialize each weight which survived the pruning to one (1)-
    for wts in mask_model.trainable_weights:
        wts.assign(tf.where(tf.equal(wts, 0.), 0., 1.))

    # User input parameters for the manual Early Stopping implementation-
    minimum_delta = 0.001
    patience = 3
    best_val_loss = 100
    loc_patience = 0

    # Initialize a new Conv-6 model and load the winning-ticket weights-
    winning_ticket_model = conv6_cnn()
    winning_ticket_model.set_weights(orig_model.get_weights())

    # Define 'train_one_step()' and 'test_step()' functions here-
    @tf.function
    def train_one_step(model, mask_model, optimizer, x, y):
        '''
        Compute one step of gradient descent optimization
        '''
        with tf.GradientTape() as tape:
            # Make predictions using the defined model-
            y_pred = model(x)

            # Compute loss-
            loss = loss_fn(y, y_pred)

        # Compute gradients wrt the defined loss and the weights and biases-
        grads = tape.gradient(loss, model.trainable_variables)

        # List to hold the element-wise multiplication between
        # the computed gradients and the masks-
        grad_mask_mul = []
        for grad_layer, mask in zip(grads, mask_model.trainable_weights):
            grad_mask_mul.append(tf.math.multiply(grad_layer, mask))

        # Apply the masked gradients to the model's weights and biases-
        optimizer.apply_gradients(zip(grad_mask_mul, model.trainable_variables))

        # Accumulate loss and accuracy metrics-
        train_loss(loss)
        train_accuracy(y, y_pred)

    @tf.function
    def test_step(model, optimizer, data, labels):
        """
        Test model performance on the testing dataset
        """
        predictions = model(data)
        t_loss = loss_fn(labels, predictions)

        test_loss(t_loss)
        test_accuracy(labels, predictions)

    curr_step = 0

    for x, y in datagen.flow(X_train, y_train, batch_size=batch_size, shuffle=True):
        train_one_step(winning_ticket_model, mask_model, optimizer, x, y)
        curr_step += 1

        # datagen.flow() loops indefinitely, so stop after one epoch-
        if curr_step >= X_train.shape[0] // batch_size:
            print("\nTerminating training (datagen.flow())")
            break
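The key step above is the element-wise product between each gradient and its mask, which keeps pruned weights at zero during training. A minimal NumPy sketch of just that step, using made-up toy values in place of real layer gradients:

```python
import numpy as np

# Toy gradient and pruning mask for a single layer (made-up values,
# standing in for one entry of grads / mask_model.trainable_weights).
grads = np.array([0.5, 0.2, 0.1, 0.3])
mask = np.array([1.0, 0.0, 1.0, 0.0])  # 0.0 marks a pruned weight

# Element-wise product, as in tf.math.multiply(grad_layer, mask):
# pruned positions get a zero gradient, so the optimizer update can
# never revive a weight that was pruned away.
masked_grads = grads * mask

print(masked_grads.tolist())  # [0.5, 0.0, 0.1, 0.0]
```

The same idea scales to the per-layer loop in train_one_step(): any position the mask zeroes out receives no update, so sparsity is preserved.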
But the following code gives an error:

    for x_t, y_t in test_dataset:
        test_step(winning_ticket_model, optimizer, x_t, y_t)

> ValueError Traceback (most recent call
> last) in
> 1 for x_t, y_t in test_dataset:
> ----> 2 test_step(winning_ticket_model, optimizer, x_t, y_t)
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py
> in __call__(self, *args, **kwds)
> 578 xla_context.Exit()
> 579 else:
> 580 result = self._call(*args, **kwds)
> 581
> 582 if tracing_count == self._get_tracing_count():
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py
> in _call(self, *args, **kwds)
> 625 # This is the first call of __call__, so we have to initialize.
> 626 initializers = []
> 627 self._initialize(args, kwds, add_initializers_to=initializers)
> 628 finally:
> 629 # At this point we know that the initialization is complete (or less
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py
> in _initialize(self, args, kwds, add_initializers_to)
> 504 self._concrete_stateful_fn = (
> 505 self._stateful_fn._get_concrete_function_internal_garbage_collected(
> # pylint: disable=protected-access
> 506 *args, **kwds))
> 507
> 508 def invalid_creator_scope(*unused_args, **unused_kwds):
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/function.py
> in _get_concrete_function_internal_garbage_collected(self, *args,
> **kwargs) 2444 args, kwargs = None, None 2445 with self._lock:
> -> 2446 graph_function, _, _ = self._maybe_define_function(args, kwargs) 2447 return graph_function 2448
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/function.py
> in _maybe_define_function(self, args, kwargs) 2775 2776
> self._function_cache.missed.add(call_context_key)
> -> 2777 graph_function = self._create_graph_function(args, kwargs) 2778 self._function_cache.primary[cache_key] =
> graph_function 2779 return graph_function, args, kwargs
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/function.py
> in _create_graph_function(self, args, kwargs,
> override_flat_arg_shapes) 2665 arg_names=arg_names,
> 2666 override_flat_arg_shapes=override_flat_arg_shapes,
> -> 2667 capture_by_value=self._capture_by_value), 2668 self._function_attributes, 2669 # Tell the ConcreteFunction
> to clean up its graph once it goes out of
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py
> in func_graph_from_py_func(name, python_func, args, kwargs, signature,
> func_graph, autograph, autograph_options, add_control_dependencies,
> arg_names, op_return_value, collections, capture_by_value,
> override_flat_arg_shapes)
> 979 _, original_func = tf_decorator.unwrap(python_func)
> 980
> 981 func_outputs = python_func(*func_args, **func_kwargs)
> 982
> 983 # invariant: `func_outputs` contains only Tensors, CompositeTensors,
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py
> in wrapped_fn(*args, **kwds)
> 439 # __wrapped__ allows AutoGraph to swap in a converted function. We give
> 440 # the function a weak reference to itself to avoid a reference cycle.
> 441 return weak_wrapped_fn().__wrapped__(*args, **kwds)
> 442 weak_wrapped_fn = weakref.ref(wrapped_fn)
> 443
>
> ~/.local/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py
> in wrapper(*args, **kwargs)
> 966 except Exception as e: # pylint:disable=broad-except
> 967 if hasattr(e, "ag_error_metadata"):
> 968 raise e.ag_error_metadata.to_exception(e)
> 969 else:
> 970 raise
>
> ValueError: in user code:
>
> :45 test_step *
> predictions = model(data)
> /home/majumdar/.local/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py:886
> __call__ **
> self.name)
> /home/majumdar/.local/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py:180
> assert_input_compatibility
> str(x.shape.as_list()))
>
> ValueError: Input 0 of layer sequential_7 is incompatible with the layer: expected ndim=4, found ndim=3. Full shape received: [32, 32, 3]
>


What's the problem?
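In case it helps narrow things down: the final line of the traceback reports expected ndim=4, found ndim=3, with shape [32, 32, 3], i.e. the model is receiving single images instead of batches. A from_tensor_slices() dataset yields unbatched elements unless .batch() is applied, so (assuming test_dataset was meant to be built from test_dataset_features and test_dataset_labels) something like tf.data.Dataset.zip((test_dataset_features, test_dataset_labels)).batch(batch_size) would be the likely fix. The shape mismatch can be sketched without TensorFlow:

```python
import numpy as np

# Stand-in for X_test: 8 CIFAR-10-sized images of zeros (toy data).
X_test = np.zeros((8, 32, 32, 3), dtype=np.float32)

# from_tensor_slices() slices along the first axis, so iterating the
# dataset yields single images with the batch dimension stripped off:
# rank 3, which a Conv2D input layer rejects.
single_item = X_test[0]
assert single_item.ndim == 3  # shape (32, 32, 3) -> the reported error

# Batching groups items back into the rank-4 tensor the model expects.
batch = X_test[:4]
assert batch.ndim == 4  # shape (4, 32, 32, 3)
```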

Thanks!
submitted by grid_world to learnmachinelearning

Wall Street Week Ahead for the trading week beginning December 9th, 2019

Good Saturday morning to all of you here on wallstreetbets. I hope everyone on this sub made out pretty nicely in the market this past week, and is ready for the new trading week ahead.
Here is everything you need to know to get you ready for the trading week beginning December 9th, 2019.

What Trump does before trade deadline is the ‘wild card’ that will drive markets in the week ahead - (Source)

The Trump administration’s Dec. 15 deadline for new tariffs on China looms large, and while most strategists expect them to be delayed while talks continue, they don’t rule out the unexpected.
“That’s the biggest thing in the room next week. I don’t think he’s going to raise them. I think they’ll find a reason,” said James Paulsen, chief investment strategist at Leuthold Group. But Paulsen said President Donald Trump’s unpredictable nature makes it really impossible to tell what will happen as the deadline nears.
“He’s the one off you’re never sure about. It’s not just tariffs. It could be damn near anything,” Paulsen said. “I think he goes out of his way to be a wild card.”
Just in the past week, Trump said he would put new tariffs on Brazil, Argentina and France. He rattled markets when he said he could wait until after the election for a trade deal with China.
Once dubbing himself “tariff man,” Trump reminded markets that he sees tariffs as a way of getting what he wants from an opponent, and traders were reminded tariffs may be around for a long time.
Trade certainly could be the most important event for markets in the week ahead, which also includes a Fed interest rate decision Wednesday and the U.K.’s election that could set the course for Brexit. If there’s no China deal, that could beat up stocks, send Treasury yields lower and send investors into other safe havens.
When Fed officials meet this week, they are not expected to change interest rates, but they are likely to discuss whether they believe their repo operations to drive liquidity in the short-term funding market are running smoothly, ahead of year end. Economic reports in the coming week include CPI inflation Wednesday, which could be an important input for the Fed.
Punt, but no deal

As of Friday, the White House did not appear any closer to striking a deal with China, though officials say talks are going fine. Back in August, Trump said if there is no deal, Dec. 15 is the date for a new wave of tariffs on $156 billion in Chinese goods, including cell phones, toys and laptop computers.
Dan Clifton, head of policy research at Strategas, said it seems like a low probability there will be a deal in the coming week. “What the market is focused on right now is whether there’s going to be tariffs that go into effect on Dec. 15, or not. It’s being rated pretty binary,” said Clifton. “I think what’s happening here and the actions by China overnight looks like we’re setting up for a kick.”
China removed some tariffs from U.S. agricultural products Friday, and administration officials have been talking about discussions going fine.
Clifton said if tariffs are put on hold, it’s unclear for how long. “Those are going to be larger questions that have to be answered. This is really now about politics. Is it a better idea for the president to cut a deal without major structural reforms, or should he walk away? That’s the larger debate that has to happen after Dec. 15,” Clifton said. “I’m getting worried that some in the administration... they’re leaning toward no deal category.”
Clifton said Trump’s approval rating falls when the trade wars heat up, so that may motivate him to complete the deal with China even if he doesn’t get everything he wants.
Michael Schumacher, director of rates strategy at Wells Fargo, said his base case is for a trade deal to be signed in the next couple of months, but even so, he said he can’t entirely rule out another outcome. It would make sense for tariffs to be put on hold while talks continue.
“The tweeter-in-chief controls that one, ” said Schumacher. “That’s anybody’s guess...I wouldn’t be at all surprised if he suspends it for a few weeks. If he doesn’t, that’s a pretty unpleasant result. That’s risk off. That’s pretty clear.”
Because the next group of tariffs would be on consumer goods, economists fear they could hit the economy through the consumer, the strongest and largest engine behind economic growth.
Fed ahead

The Fed has moved to the sidelines and says it is monitoring economic data before deciding its next move. Friday’s strong November jobs report, with 266,000 jobs added, reinforces the Fed’s decision to move to neutral for now.
So the most important headlines from its meeting this week could be about the repo market, basically the plumbing for the financial system where financial institutions fund themselves. Interest rates in that somewhat obscure market spiked in September. Market pros said the issue was a cash crunch in the short term lending market, made better when the Fed started repo operations.
The Fed now has multiple operations running over year end, and Schumacher said it has latitude to do more. Strategists expect there to be more pressure on the repo market as banks rein in operations to spruce up their balance sheets at year end.
“No one is going to come to the Fed and say you did too much in the year-end funding,” said Schumacher. “If repo happens to spike somewhat on one day, the Fed is going to hammer it the next day.”
Paulsen said the markets will be attuned to this week’s inflation numbers. Consumer inflation, the CPI is reported on Wednesday and producer prices are Thursday.
A pickup in inflation of any significance is one thing that could pull the Fed from the sidelines, and prod it to consider a rate hike.
“I think the inflation reports might start to get a little attention. Given the jobs numbers, the employment rate, growth picking up a little bit and a better tone in manufacturing. I do think if you get some hot CPI number, I don’t know if the Fed can ignore it,” he said. “Core CPI is 2.3%.” He said it would get noticed if it jumped to 2.5% or better.
The Fed’s inflation target is 2% but its preferred measure is the PCE inflation, and that remains under 2%.
Stocks were sharply higher Friday but ended the past week flattish. The S&P 500 was slightly higher, up 0.2% at 3,145, and the Dow was down 0.1% at 28,015. The Nasdaq was 0.1% lower, ending the week at 8,656.

This past week saw the following moves in the S&P:

(CLICK HERE FOR THE FULL S&P TREE MAP FOR THE PAST WEEK!)

Major Indices for this past week:

(CLICK HERE FOR THE MAJOR INDICES FOR THE PAST WEEK!)

Major Futures Markets as of Friday's close:

(CLICK HERE FOR THE MAJOR FUTURES INDICES AS OF FRIDAY!)

Economic Calendar for the Week Ahead:

(CLICK HERE FOR THE FULL ECONOMIC CALENDAR FOR THE WEEK AHEAD!)

Sector Performance WTD, MTD, YTD:

(CLICK HERE FOR FRIDAY'S PERFORMANCE!)
(CLICK HERE FOR THE WEEK-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE MONTH-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE 3-MONTH PERFORMANCE!)
(CLICK HERE FOR THE YEAR-TO-DATE PERFORMANCE!)
(CLICK HERE FOR THE 52-WEEK PERFORMANCE!)

Percentage Changes for the Major Indices, WTD, MTD, QTD, YTD as of Friday's close:

(CLICK HERE FOR THE CHART!)

S&P Sectors for the Past Week:

(CLICK HERE FOR THE CHART!)

Major Indices Pullback/Correction Levels as of Friday's close:

(CLICK HERE FOR THE CHART!

Major Indices Rally Levels as of Friday's close:

(CLICK HERE FOR THE CHART!)

Most Anticipated Earnings Releases for this week:

(CLICK HERE FOR THE CHART!)

Here are the upcoming IPO's for this week:

(CLICK HERE FOR THE CHART!)

Friday's Stock Analyst Upgrades & Downgrades:

(CLICK HERE FOR THE CHART LINK #1!)
(CLICK HERE FOR THE CHART LINK #2!)

Reasons We Still Believe In December

It has been a rough start to the most wonderful month of them all, with the S&P 500 Index down each of the first two days of December. Don’t stop believing just yet, though.
Everyone knows December has usually been a good month for stocks, but what happened last year is still fresh in the minds of many investors. The S&P 500 fell 9.1% in December 2018 for the worst December since 1931. That sounds really bad, until you realize stocks fell 30% in September 1931, but we digress.
One major difference between now and last year is how well the global equities have been performing. Heading into December 2018, the S&P 500 was up 3.2% year to date, but markets outside of the United States were already firmly in the red, with many down double digits.
“We don’t think stocks are on the verge of another massive December sell off,” said LPL Financial Senior Market Strategist Ryan Detrick. “If my Cincinnati Bengals can win a game, anything is possible. However, we are quite encouraged by the overall participation we are seeing from various global stock markets this year versus last year, when the United States was about the only market in the green heading into December.”
Stocks have also overcome volatile starts to December recently. The S&P 500 was down four days in a row to start 2013 and 2017, but the gauge still managed to gain 2.4% and 1%, respectively, in those years.
As the LPL Chart of the Day shows, December has been the second-best month of the year for stocks going back to 1950. It is worth noting that it was the best month of the year before last year’s massive drop. Stocks have historically been strong in pre-election years as well, and December has never been lower two times in a row during a pre-election year. Given stocks fell in December 2015, bulls could be smiling when this month is wrapped up.
(CLICK HERE FOR THE CHART!)

Could Impeachment Be Good for Investors?

Impeaching a President with the possibility of removal from office is by no means great for the country. However, it may not be so horrible for the stock market or investors if history is any guide. We first touched on this over two years ago here on the blog. Now that much has transpired and the US House of Representatives is proceeding with drafting articles of impeachment, we figured it was a good time to revisit the (albeit limited) history of market behavior during presidential impeachment proceedings. The three charts below really tell the story.
During the Watergate scandal of Nixon’s second term the market suffered a major bear market from January 1973 to October/December 1974 with the Dow down 45.1%, S&P 500 down 48.2% and NASDAQ down 59.9%. Sure there were other factors that contributed to the bear market such as the Oil Embargo, Arab-Israeli War, collapse of the Bretton Woods system, high inflation and Watergate. However, shortly after Nixon resigned on August 9, 1974 the market reached the secular bear market low on October 3 for S&P and NASDAQ and December 6 for the Dow.
Leading up to the Clinton investigations and through his subsequent impeachment and the acquittal by the Senate the market was on a tear as one of the biggest bull markets in history raged on. After the 1994 midterm elections when the Republicans took back control of both houses of Congress the market remained on a 45 degree upward trajectory except for a few blips and the shortest bear market on record that lasted 45 days and bottomed on August 31, 1998.
Clinton was impeached in December 1998 and acquitted in February 1999 as the market continued higher throughout his second term. Sure there were other factors that contributed to the late-1990s bull-run such as the Dotcom Boom, the Information Revolution, millennial fervor and a booming global economy, but Clinton’s personal scandal had little negative impact on markets.
It remains to be seen of course what will happen with President Trump’s impeachment proceeding and how the world and markets react, but the market continues to march on. If the limited history of impeachment proceedings of a US President in modern times (no offense to our 17th President Andrew Johnson) is any guide, the market has bounced back after the last two impeachment proceedings and was higher a year later. Perhaps it will be better to buy any impeachment dip rather than sell it.
(CLICK HERE FOR THE CHART LINK #1!)
(CLICK HERE FOR THE CHART LINK #2!!)
(CLICK HERE FOR THE CHART LINK #3!!)

Typical December Trading: Modest Strength Early, Choppy Middle and Solid Gains Late

Historically, the first trading day of December, today, has a slightly bearish bias, with the S&P 500 advancing 34 times over the last 69 years (since 1950) with an average loss of 0.02%. Tomorrow, the second trading day of December, however, has been stronger, up 52.2% of the time since 1950 with an average gain of 0.08%, and the third day is better still, up 59.4% of the time.
Over the more recent 21-year period, December has opened with strength and gains over its first seven trading days before beginning to drift. By mid-month all five indices have surrendered any early-month gains, but shortly thereafter Santa usually visits sending the market higher until the last day of the month and the year when last minute selling, most likely for tax reasons, briefly interrupts the market’s rally.
(CLICK HERE FOR THE CHART!)

Odds Still Favor A Gain for Rest of December Despite Rough Start

Just when it was beginning to look like trade was heading in a positive direction, the wind changed direction again. Yesterday it was steel and aluminum tariffs on Brazil and Argentina and today a deal with China may not happen as soon as previously anticipated. The result was the worst first two trading days of December since last year and the sixth worst start since 1950 for S&P 500. DJIA and NASDAQ are eighth worst since 1950 and 1971, respectively.
However, historically, past weakness in early December (a combined loss over the first two trading days) was still followed by gains for the remainder of the month the majority of the time. DJIA has advanced 74.19% of the time following losses over the first two trading days, with an average gain for the remainder of December of 1.39%. S&P 500 was up 67.65% of the time with an average rest-of-month gain of 0.84%. NASDAQ is modestly softer, advancing 61.11% of the time during the remainder of December with an average advance of 0.30%.
(CLICK HERE FOR THE CHART LINK #1!)
(CLICK HERE FOR THE CHART LINK #2!)
(CLICK HERE FOR THE CHART LINK #3!)
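The conditional statistics above (rest-of-month performance given a losing first two days) can be tallied the same way. A sketch, with invented December returns standing in for the real index history:

```python
# Sketch: given Decembers whose first two trading days combined for a loss,
# how often was the rest of the month positive, and what was the average
# rest-of-month return? All values below are hypothetical.

# {year: (first-two-day % return, rest-of-month % return)} -- invented data
decembers = {
    2002: (-1.8, 2.1),
    2005: (-0.4, -0.9),
    2008: (-4.0, 3.5),
    2014: (-0.7, 0.2),
    2018: (0.5, -9.6),   # positive start: excluded from the condition
}

# Keep only the years that match the condition (weak two-day start)
weak_starts = [rest for start, rest in decembers.values() if start < 0]
win_rate = sum(1 for r in weak_starts if r > 0) / len(weak_starts)
avg_rest = sum(weak_starts) / len(weak_starts)
print(f"{win_rate:.2%} advanced, average rest-of-month {avg_rest:+.2f}%")
```

The post's "advanced 74.19% of the time ... average gain of 1.39%" figures for the DJIA are exactly this kind of conditional frequency and conditional mean, computed over 1950 onward.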

STOCK MARKET VIDEO: Stock Market Analysis Video for Week Ending December 6th, 2019

(CLICK HERE FOR THE YOUTUBE VIDEO!)

STOCK MARKET VIDEO: ShadowTrader Video Weekly 12.8.19

([CLICK HERE FOR THE YOUTUBE VIDEO!]())
(VIDEO NOT YET POSTED!)
Here are the most notable companies (tickers) reporting earnings in this upcoming trading week ahead:
  • $LULU
  • $COST
  • $THO
  • $AZO
  • $ADBE
  • $AVGO
  • $CIEN
  • $MDB
  • $CHWY
  • $SFIX
  • $AEO
  • $GME
  • $OLLI
  • $TOL
  • $PLCE
  • $UNFI
  • $PLAY
  • $ORCL
  • $HDS
  • $CONN
  • $MTN
  • $JT
  • $LOVE
  • $CMD
  • $PLAB
  • $DBI
  • $ROAD
  • $VRA
  • $CDMO
  • $LQDT
  • $TLRD
  • $TWST
  • $PHR
  • $NDSN
  • $MESA
  • $VERU
  • $DLHC
  • $BLBD
  • $OXM
  • $NX
  • $GNSS
  • $PHX
  • $GTIM
(CLICK HERE FOR NEXT WEEK'S MOST NOTABLE EARNINGS RELEASES!)
(CLICK HERE FOR NEXT WEEK'S HIGHEST VOLATILITY EARNINGS RELEASES!)
(CLICK HERE FOR MOST ANTICIPATED EARNINGS RELEASES FOR THE NEXT 5 WEEKS!)
Below are some of the notable companies coming out with earnings releases this upcoming trading week ahead which includes the date/time of release & consensus estimates courtesy of Earnings Whispers:

Monday 12.9.19 Before Market Open:

(CLICK HERE FOR MONDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Monday 12.9.19 After Market Close:

(CLICK HERE FOR MONDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Tuesday 12.10.19 Before Market Open:

(CLICK HERE FOR TUESDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Tuesday 12.10.19 After Market Close:

(CLICK HERE FOR TUESDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Wednesday 12.11.19 Before Market Open:

(CLICK HERE FOR WEDNESDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Wednesday 12.11.19 After Market Close:

(CLICK HERE FOR WEDNESDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Thursday 12.12.19 Before Market Open:

(CLICK HERE FOR THURSDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!)

Thursday 12.12.19 After Market Close:

(CLICK HERE FOR THURSDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!)

Friday 12.13.19 Before Market Open:

([CLICK HERE FOR FRIDAY'S PRE-MARKET EARNINGS TIME & ESTIMATES!]())
NONE.

Friday 12.13.19 After Market Close:

([CLICK HERE FOR FRIDAY'S AFTER-MARKET EARNINGS TIME & ESTIMATES!]())
NONE.

lululemon athletica inc. $229.38

lululemon athletica inc. (LULU) is confirmed to report earnings at approximately 4:05 PM ET on Wednesday, December 11, 2019. The consensus earnings estimate is $0.93 per share on revenue of $896.50 million and the Earnings Whisper ® number is $0.98 per share. Investor sentiment going into the company's earnings release has 73% expecting an earnings beat. The company's guidance was for earnings of $0.90 to $0.92 per share on revenue of $880.00 million to $890.00 million. Consensus estimates are for year-over-year earnings growth of 24.00% with revenue increasing by 19.91%. Short interest has increased by 9.8% since the company's last earnings release while the stock has drifted higher by 16.0% from its open following the earnings release to be 26.0% above its 200 day moving average of $182.08. Overall earnings estimates have been revised higher since the company's last earnings release. On Friday, December 6, 2019 there was some notable buying of 927 contracts of the $260.00 call expiring on Friday, December 13, 2019. Option traders are pricing in an 8.3% move on earnings and the stock has averaged an 11.1% move in recent quarters.

(CLICK HERE FOR THE CHART!)
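The "priced-in move" quoted for each name is commonly approximated as the cost of the at-the-money straddle expiring just after earnings, divided by the share price (an assumption here; the post does not state Earnings Whispers' exact method). A sketch with hypothetical option prices:

```python
# Sketch: approximating the option-implied earnings move as the ATM straddle
# cost (call + put at the strike nearest spot) as a fraction of the share
# price. The option prices below are hypothetical.

def implied_earnings_move(call_price, put_price, stock_price):
    """Approximate implied move: ATM straddle cost as a fraction of spot."""
    return (call_price + put_price) / stock_price

# Hypothetical ATM call/put prices for a stock at LULU's quoted $229.38
move = implied_earnings_move(call_price=9.50, put_price=9.60, stock_price=229.38)
print(f"implied move: {move:.1%}")
```

Comparing this figure against the stock's average realized move in recent quarters (11.1% for LULU) is the trade-off each of these previews is highlighting: when options price in less than the stock has historically moved, straddles look relatively cheap, and vice versa.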

Costco Wholesale Corp. $294.95

Costco Wholesale Corp. (COST) is confirmed to report earnings at approximately 4:15 PM ET on Thursday, December 12, 2019. The consensus earnings estimate is $1.70 per share on revenue of $37.43 billion and the Earnings Whisper ® number is $1.74 per share. Investor sentiment going into the company's earnings release has 78% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 5.59% with revenue increasing by 6.73%. Short interest has increased by 19.3% since the company's last earnings release while the stock has drifted higher by 2.5% from its open following the earnings release to be 10.3% above its 200 day moving average of $267.50. Overall earnings estimates have been revised higher since the company's last earnings release. On Tuesday, November 19, 2019 there was some notable buying of 916 contracts of the $265.00 put expiring on Friday, December 27, 2019. Option traders are pricing in a 3.7% move on earnings and the stock has averaged a 3.6% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Thor Industries, Inc. $67.77

Thor Industries, Inc. (THO) is confirmed to report earnings at approximately 6:45 AM ET on Monday, December 9, 2019. The consensus earnings estimate is $1.23 per share on revenue of $2.30 billion and the Earnings Whisper ® number is $1.30 per share. Investor sentiment going into the company's earnings release has 69% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 16.89% with revenue increasing by 30.98%. Short interest has increased by 48.1% since the company's last earnings release while the stock has drifted higher by 25.5% from its open following the earnings release to be 16.0% above its 200 day moving average of $58.44. Overall earnings estimates have been revised lower since the company's last earnings release. On Tuesday, December 3, 2019 there was some notable buying of 838 contracts of the $60.00 put expiring on Friday, December 20, 2019. Option traders are pricing in a 10.0% move on earnings and the stock has averaged a 7.6% move in recent quarters.

(CLICK HERE FOR THE CHART!)

AutoZone, Inc. -

AutoZone, Inc. (AZO) is confirmed to report earnings at approximately 6:55 AM ET on Tuesday, December 10, 2019. The consensus earnings estimate is $13.69 per share on revenue of $2.76 billion and the Earnings Whisper ® number is $14.02 per share. Investor sentiment going into the company's earnings release has 76% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 1.63% with revenue increasing by 4.48%. Short interest has decreased by 13.7% since the company's last earnings release while the stock has drifted higher by 1.1% from its open following the earnings release to be 8.9% above its 200 day moving average of $1,077.00. Overall earnings estimates have been revised lower since the company's last earnings release. Option traders are pricing in a 5.5% move on earnings and the stock has averaged a 5.6% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Adobe Inc. $306.23

Adobe Inc. (ADBE) is confirmed to report earnings at approximately 4:05 PM ET on Thursday, December 12, 2019. The consensus earnings estimate is $2.26 per share on revenue of $2.97 billion and the Earnings Whisper ® number is $2.30 per share. Investor sentiment going into the company's earnings release has 74% expecting an earnings beat. The company's guidance was for earnings of approximately $2.25 per share. Consensus estimates are for year-over-year earnings growth of 23.50% with revenue increasing by 20.51%. Short interest has increased by 44.6% since the company's last earnings release while the stock has drifted higher by 11.2% from its open following the earnings release to be 9.1% above its 200 day moving average of $280.60. Overall earnings estimates have been revised higher since the company's last earnings release. On Monday, November 25, 2019 there was some notable buying of 505 contracts of the $340.00 call expiring on Friday, December 20, 2019. Option traders are pricing in a 3.9% move on earnings and the stock has averaged a 3.8% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Broadcom Limited $316.05

Broadcom Limited (AVGO) is confirmed to report earnings at approximately 4:15 PM ET on Thursday, December 12, 2019. The consensus earnings estimate is $5.36 per share on revenue of $5.76 billion and the Earnings Whisper ® number is $5.47 per share. Investor sentiment going into the company's earnings release has 69% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 7.27% with revenue increasing by 5.80%. Short interest has increased by 22.8% since the company's last earnings release while the stock has drifted higher by 6.2% from its open following the earnings release to be 9.7% above its 200 day moving average of $288.21. Overall earnings estimates have been revised lower since the company's last earnings release. On Thursday, December 5, 2019 there was some notable buying of 625 contracts of the $135.00 call expiring on Friday, January 15, 2021. Option traders are pricing in a 5.2% move on earnings and the stock has averaged a 4.7% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Ciena Corporation $35.00

Ciena Corporation (CIEN) is confirmed to report earnings at approximately 7:00 AM ET on Thursday, December 12, 2019. The consensus earnings estimate is $0.66 per share on revenue of $964.80 million and the Earnings Whisper ® number is $0.67 per share. Investor sentiment going into the company's earnings release has 72% expecting an earnings beat. The company's guidance was for revenue of $945.00 million to $975.00 million. Consensus estimates are for year-over-year earnings growth of 26.92% with revenue increasing by 7.28%. Short interest has increased by 66.6% since the company's last earnings release while the stock has drifted lower by 9.5% from its open following the earnings release to be 11.0% below its 200 day moving average of $39.32. Overall earnings estimates have been revised higher since the company's last earnings release. On Friday, December 6, 2019 there was some notable buying of 1,156 contracts of the $36.00 put expiring on Friday, December 13, 2019. Option traders are pricing in a 9.0% move on earnings and the stock has averaged a 10.1% move in recent quarters.

(CLICK HERE FOR THE CHART!)

MongoDB, Inc. $131.17

MongoDB, Inc. (MDB) is confirmed to report earnings at approximately 4:05 PM ET on Monday, December 9, 2019. The consensus estimate is for a loss of $0.28 per share on revenue of $99.73 million and the Earnings Whisper ® number is ($0.26) per share. Investor sentiment going into the company's earnings release has 63% expecting an earnings beat. The company's guidance was for a loss of $0.27 to $0.29 per share on revenue of $98.00 million to $100.00 million. Consensus estimates are for year-over-year earnings growth of 15.15% with revenue increasing by 53.47%. Short interest has increased by 15.2% since the company's last earnings release while the stock has drifted lower by 16.3% from its open following the earnings release to be 5.1% below its 200 day moving average of $138.19. Overall earnings estimates have been revised lower since the company's last earnings release. On Tuesday, November 19, 2019 there was some notable buying of 970 contracts of the $210.00 call expiring on Friday, December 20, 2019. Option traders are pricing in a 10.1% move on earnings and the stock has averaged an 8.7% move in recent quarters.

(CLICK HERE FOR THE CHART!)

Chewy, Inc. $24.95

Chewy, Inc. (CHWY) is confirmed to report earnings at approximately 4:10 PM ET on Monday, December 9, 2019. The consensus estimate is for a loss of $0.16 per share on revenue of $1.21 billion and the Earnings Whisper ® number is ($0.15) per share. Investor sentiment going into the company's earnings release has 57% expecting an earnings beat. Short interest has increased by 40.7% since the company's last earnings release while the stock has drifted lower by 14.6% from its open following the earnings release. Overall earnings estimates have been revised lower since the company's last earnings release. The stock has averaged a 6.4% move on earnings in recent quarters.

(CLICK HERE FOR THE CHART!)

Stitch Fix, Inc. $24.09

Stitch Fix, Inc. (SFIX) is confirmed to report earnings at approximately 4:05 PM ET on Monday, December 9, 2019. The consensus estimate is for a loss of $0.06 per share on revenue of $441.04 million and the Earnings Whisper ® number is ($0.04) per share. Investor sentiment going into the company's earnings release has 69% expecting an earnings beat. The company's guidance was for revenue of $438.00 million to $442.00 million. Consensus estimates are for earnings to decline year-over-year by 160.00% with revenue increasing by 20.43%. Short interest has increased by 30.9% since the company's last earnings release while the stock has drifted higher by 41.7% from its open following the earnings release to be 2.4% below its 200 day moving average of $24.69. Overall earnings estimates have been revised lower since the company's last earnings release. On Thursday, November 21, 2019 there was some notable buying of 1,000 contracts of the $13.00 put expiring on Friday, January 17, 2020. Option traders are pricing in a 20.0% move on earnings and the stock has averaged an 18.9% move in recent quarters.

(CLICK HERE FOR THE CHART!)

DISCUSS!

What are you all watching for in this upcoming trading week?
I hope you all have a wonderful weekend and a great trading week ahead, wallstreetbets.
submitted by bigbear0083 to wallstreetbets [link] [comments]
