Overall Program Rating Mean Rating = 8.05 (N = 58, out of 232, 25% response rate) On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate this CASRO program overall?
Educational Content Mean Rating = 7.98 N = 58 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate the educational content?
Amount of Information Learned Mean Rating = 7.88 N = 58 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate the amount of information learned?
Networking Opportunities Mean Rating = 7.48 N = 58 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate the networking opportunities?
Why Register for This Event N = 58 Other reasons included Presenter/Speaker (3 mentions) and one mention each of New to Market Research, Networking, On Committee, and Sponsoring. Why did you register for this event?
Download Conference App N = 58 • Positive Comments • Good/Excellent/Worked Well General (5 mentions) • Liked Schedule (5 mentions) • Useful/Handy (4 mentions) • Good List of Attendees (2 mentions) • Negative Comments • Not Many Attendees Signed Up (6 mentions) • No Features/Information (3 mentions) • Did Not Work/Crashed (2 mentions) • Schedule Not Working (2 mentions) Did you download and use the conference mobile app? Please provide us with any additional insights regarding your use of the mobile apps.
Philip Garland • Comments: • Didn’t really get his point. • Good presentation, which set the tone for the conference. However, I wish the "new way" in question had been better explained. Without more details, it seems like we just have to buy the "black box" methodology or not. • Good speech, looking forward to him releasing a tool for sale. • He was a good presenter even if I do not agree with him. • I always enjoy Philip's presentations. • I am concerned about the research process used in this paper. Unfortunately, he did not have enough time to properly explain his methodology and actual calculation method, but I am wary of the finding. • Interesting and insightful. New information. • Interesting, radical. • Not bad, but also not the whole story, and he came across a touch arrogant. • Philip should be brought back again and again. But they need to change the name of that company! • Seems a lot of work to do on every survey. Too much judgment call in my opinion. • The overall message was clear, but the underlying analysis to support the theory could have been explained in greater detail. • This drew a misleading conclusion based on a weak test. You can't test the effect of speeding on a 5-minute survey. Speeding has already been proven to be highly correlated to other data quality issues, so this is misleading and erroneous. • Too fast. There was more here I'm sure. • Too much data to contradict his findings. • Very interesting, but would have benefited from more time to dig even more into the math details. • Wish it had been longer. The implications of the research could have been made more clear. I loved the approach. Mean Rating = 7.78 N = 55 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Kerry Levin • Comments: • A little out of place. However, I think that the results were important. • Difficult to apply any of the knowledge shared to our business, and felt that would probably be the case for many in attendance. Topic was perhaps too narrow in focus and the learning from the research was not new. • Good. • Good new insights. A clever solution. • I like the idea of presenting a concrete study with the IRS. However, I also find this methodology somewhat flawed. They are pointing out the respondent's error and removing validity checks in the data. • Information interesting. Presentation could have been more clear. • Interesting, but not terribly relevant to me. • Interesting. • Not very applicable, and an expensive way to reach paper respondents. • Over my head... • Sample size of the outliers was very low (only 30 or so online) and didn't have a lot of relevance to what I do day in and day out. • Solid presentation. I liked that the author tried to resolve outliers instead of getting rid of them like other practitioners advise us to. The comparison with offline was not so relevant to this audience. • Super boring. • Technical, good to know, detailed. • This session was a little dry and could have been a bit more lively. • Thought it was a good clear piece, but perhaps left more questions than it answered about whether to deal with outliers or not (and how). • Took away the idea of adding follow-ups to numerical questions. Otherwise, probably a little less applicable. Mean Rating = 6.78 N = 54 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Frede & Suresh • Comments: • A trending topic that was the cause for much discussion following the presentation. Personally, I thought they did a great job but believe there is much work to be done in this area to improve validation methods. • Always have issues with Suresh calling people "bad" when they are just "unvalidated" :-) but as usual a solid piece from him. • An interesting topic. I would like to hear more about how the process of validation is being offered from their company specifically. • Boring, boring, boring. • Good overall, but the product needs to be more robust, including support of more countries. • I can not remember this one very well. • I thought this was an interesting discussion. I was not aware of the current validation processes in place at panel companies. • Interesting - but too much of a TrueSample advertisement. • Interesting ideas. • Seemed like a good approach to me. • Strong follow-up from last year's paper. • Suresh will not be happy until he gets rid of all non-normal respondents. After years of propaganda, no one in the audience cares to challenge the notion that identity control has anything to do with research. • Unclear what the "second phase validation" actually referred to. What was the data compared to? • Very important, and I am taking steps to implement the suggestions. • Very valuable new information. • When Susan Frede speaks, everyone shuts up and listens. She's another one that should be brought back again and again. Mean Rating = 7.48 N = 56 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Steven Gittelman • Comments: • Even more boring - same old stuff from Steve. • Gittelman is entertaining and smart. I learned a lot here. • He comes off as being too much of a salesman for his view of the world - and his consulting services. • Heard it before. • I enjoyed this presentation. I know that this will probably not happen, but I would love to know which panels are more inconsistent than others. • A nice speech. I would like to see the data, especially by panel if possible. • I really like how panels that are consistent are also more engaged. It is great that they go together. • I'm not sure Steve really proved his point, the rest I have seen many times before. • Interesting. Tough data to present but very interesting. • Not Mr. Gittelman's best work. We saw a lot of numbers, but I'm not sure what the end conclusion is this time. Too many contradicting results? Not enough data? Lack of universal truth? • Nothing new from Steve. • Steve did a good job applying his data to a new topic. Went very well with the other presentations. • Steve did a great job of making his research accessible and expanding upon what steps must be taken going forward. Mean Rating = 7.45 N = 51 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
McCrary & Veling • Comments: • Dull. Nothing new. • Fairly good, but I would like to see data on the real attrition rather than just the opt out rate. • I thought this was a great presentation with sound research. • I would disagree with the emphasis they put on their findings, but reasonably good work. • Interesting. • Interesting findings on maximum routing duration, although it felt like the analysis could have been refined. • Not actionable from a customer of panels perspective. • Panels need to find a solution to this - we as clients do not want to pay for a bad panel of professional respondents. • Should have done a great job of assuaging the concerns of those in attendance who shudder at the thought of routers. • Would have loved to see additional testing. Good paper. Mean Rating = 7.58 N = 48 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Johnson & Schiers • Comments: • A lot of relevant content. • Excellent paper. Strong multivariate methodology, good caveats for projectability of results. • Good. • Great presentation, but I was not fond of the use of the word "death" and "death of a panelist". • I am biased, but I thought this one was the best. • I thought this was interesting and educational for full service firms. • Interesting. • Not actionable from a customer of panels perspective. • The premise that some key experiential factors don't matter was based on people who made it past the first week. For those who left during the first week, the findings are different and should be explored. • I thought this was excellent and the kick off to a real dialogue about what we do to panelists/respondents. • Very interesting. Understanding panelists’ desire to stay on panel is important to me. We need more to stay. This was a little scary to learn how there are so many on the fence and we end up with professional survey takers...instead of fresh panel. • Very poor research, flawed conclusions. • I was very intrigued by the promise of the presentation but ultimately left wondering about the results of the research. I will certainly download the paper for further review as this is a topic that means a lot to us as we constantly look for ways to improve. Mean Rating = 7.72 N = 47 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Panel of Panelists • Comments: • Great • Outstanding • The best part of the conference • Great idea • Loved the concept • Interesting and scary at the same time • Outstanding job by the organizers • John Bremer was a great moderator of this session, worth the whole conference. • Scary to hear this from respondents • Really great • Enlightening and enjoyable • I loved it, very transparent • This was gold. So great to hear this content • Amazing, great session, well done • John Bremer was a great moderator—smart and very funny too • I am going to join more panels to get my wife a second home • Informative and very entertaining at the same time • How much did CASRO pay these people to attend? • Respondents are the lifeblood of our industry and it was great to hear from them firsthand • The standing ovation was well deserved, great session • The organizers selected an eclectic but accurate representation of respondents • Well organized and a great way to end the day • Very eye-opening • I always thought my panel was the best but apparently the Toluna panel is Mean Rating = 8.72 N = 55 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
George Terhanian • Comments: • Excellent stuff as one might expect - think it struggles through not being operationalized, and therefore more open than nailed down. • Eye-opening. • George did a great job making it clear how their approach works, however, not sure that it's the right answer. Look forward to hearing more about it as they conduct further research. • Good interesting approach, but I would love to see how it matches real purchase data. Basically, I am not convinced they have the right questions for studies. • I don’t like this approach - it leads us into data problems and bad respondents. This will be something panel companies need to address in the coming months. The speaker and topic were interesting though. • I don't need George Terhanian to spend 15 minutes on the history of online sample. I would have preferred more details about the methodology. Again, another black box that the audience can just buy or not by lack of transparency. • Intriguing, dense, need to do more checks on George. • More research is required. • Really interesting. I would like to hear about how this is being provided as a service. • Some great ideas for further thinking. • So-so. • This was unbelievably interesting. One of the best sessions I have been to at a CASRO conference. Very enlightening. • Very interesting approach. Would have liked more results, less primer on propensity weighting. Strong contribution. Mean Rating = 7.74 N = 47 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Nancy Brigham • Comments: • A nice talk, but not something I could do much with as a customer of panel suppliers. • Disappointing. It seems Ipsos was trying to keep some things confidential, but so much so, that it was hard to find anything to hold on to when it came to conclusions. • Excellent attempt at a very difficult subject. • Great information again on routers, but certainly were left with some questions about the methodology employed and how critical changes to their approach would affect the outcome of further research. Look forward to hearing more. • I liked the point that they needed more surveys in the router. • Informative. • Interesting. The key piece of information was the minimum number of studies for a router to have a non-bias effect. • It's about time! Wish she would have hit the importance of the results as much as the caveats. • Very complex experimental design - I am not sure I totally understood even though I read the paper. Sounds interesting, although I feel the individual results of each simulation should have been reported (not just the averages). • I would like to see the same analysis with blended sample sometime. Mean Rating = 7.63 N = 49 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
James Mendelsohn • Comments: • Great choice for a speaker. The fact that James was a researcher in a previous life made the speech all the more relevant. • Inspiring to plant ideas on how this can be used. • Interesting and strong presentation. • James did a good job setting the stage. • Lots of good overall points, but not a lot of specific information I could use. • One of the most interesting presentations at the conference, and I came away feeling good about MR's position at the table next to BD going forward. • Quite fascinating and a pleasure to hear someone from another discipline who thinks we have something to offer. • The absolute best presentation of the entire conference, in my humble opinion. • Very interesting topic. Little bit dry of a presenter. • We'll see where this leads or if it turns into a non-starter like mobile app surveys seem to be right now. There may be a backlash for too much intrusion into people's lives. • Went long - too macro for what was a very tactical & practical agenda. • What new information was presented here? Mean Rating = 8.07 N = 44 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Miller & Smith • Comments: • I didn't think that we accomplished much here. I think that the costs way outweighed any benefit of modular design. • Good solid stuff, but probably needed more time. • Great topic. Obviously, the experiment was harder than originally expected, but at least the authors had the courage to try. • Important topic that needs to be talked about more. • Interesting to see the idea pursued, but sad to see the tradeoffs are still too high to consider breaking up a survey like that yet. • Interesting, but it would have been better to dig deeper into how to consolidate the different chunks to one full survey result picture. • Renee was great and seems to be a real authority. • So-so. • This thoroughly worried me. I think it is useful that I was able to see what others are looking into, but this completely negates the point of survey RESEARCH. This was scary and, I think, wrong. • Very good. • We're still getting past the dip our toe in the water stage with mobile so for now I was not terribly interested one way or the other. There were some interesting data points, however. Mean Rating = 7.74 N = 47 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Kelly & Johnson • Comments: • Actually, we questioned why they would make someone download an app - seems counter intuitive to do so - if they are engaged enough to take the survey let them do so in a mobile environment (html 5 or something like that). • Again, great topic. Some of the choices of the experiments can be argued, but at least the authors tried something. I'm sure we'll see more of these in the next conferences. • Good ideas. • I liked this presentation more because it found that breaking up the survey was inappropriate. I liked that they combined mobile information into the experiment as well. • I thought that research has shown that currently mobile responders will take long surveys with no need to break it into tiny 2-minute chunks. • Interesting approach. • Interesting, but it would have been better to dig deeper into how to consolidate the different chunks to one full survey result picture. • Just getting into the review of how we can apply mobile to our business, so while the conversation was interesting, there wasn't much for us to take away other than a few notable numbers. • Overall good. Needed a better set up to describe the overall concept. • I think the research design let them down. Their survey was too short and the panelists perhaps conditioned to do full surveys at a single sitting. • This is great. Mean Rating = 8.02 N = 51 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Markowitz & Brandt • Comments: • Good speakers, but the content was a little thin. • I liked the specific examples of how to use the techniques. • I really liked the introduction on the future of research, but the examples shown seemed somewhat anecdotal (far from the disruption announced in introduction). Seriously, are you telling me the future of research is collages? • Interesting. • Not my thing – sorry. • Okay. • This session also worried me. It demonstrates how panel companies care more about respondent engagement and satisfaction than actual research. Qualitative research is not practical on a large-scale and cannot be used in complex analysis. Mean Rating = 7.60 N = 43 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Dan Seldin • Comments: • A correlation of four observations? • Good to know that the social media tracks data that we care about. • I thought this was an interesting presentation. This is definitely a question on many research firms' minds and I think the findings validated initial premonitions - social media research is not yet useful, reliable, or valid. • This was interesting. Mean Rating = 7.70 N = 37 On a 1 – 10 scale with 10 being Excellent and 1 being Poor, how would you rate [insert]?
Respondent Profile
Title: C-Level (4 mentions), Partner (1 mention), Vice President (19 mentions), Director/Manager (29 mentions), Analyst (3 mentions)
Years of Experience in this Position: On average, respondents have held their current job position for 4.16 years.
Years of Experience in Survey Research Industry: On average, respondents have worked in the survey research industry for 11.38 years.
Total Experience & Insights
What specifically about the conference caused you to rate it [INSERT]?
Positive Comments*: Good Content/Topics (20 mentions), Good Presenters (17 mentions), Interesting/Informative (14 mentions), Good Overall (12 mentions), Good Level/Mix of Attendees (9 mentions), Good/Comfortable Location (5 mentions), Good Panel of Panelists (4 mentions), Relevant (3 mentions), Well Organized (3 mentions)
Negative Comments*: Poor Topics/Focus (6 mentions), Poor Presenters (6 mentions), Poor Vendor Layout (4 mentions), Not Actionable Information (3 mentions), Out of Date (2 mentions), Poor Networking (2 mentions), More Cocktail Reception (2 mentions)
When asked for additional insights (Please provide us with any additional insights regarding your experience at the conference.), comments included:
Positive Comments*: Good Panel of Panelists (8 mentions), Good Information/Papers (7 mentions), Good Networking (7 mentions), Good Venue (4 mentions), Good Presenters (3 mentions)
Negative Comments*: Poor Networking (11 mentions), Poor Slides/Graphics (2 mentions), Poor Exhibition Area (2 mentions), Poor Scheduling (2 mentions)
*Multiple mentions shown.
Total Experience Verbatims Overall Program Rating: Excellent Fabulous papers! Loved the 'real time panelist' session as well. The location was great. The topics were great and very well presented and the panel of panelists was awesome. Good job by CASRO. Great content, engaging speakers, interesting dialog, good networking opportunities. Panel of panelists. The panel of panelists, and the overall content of the sessions. All the presentations and papers were done in a quality way and the topics were well selected. Almost every single paper was strong. The content will inform the work I do. Well run conference. Great speakers, great venue. This was my first CASRO conference. I was impressed by the overall quality of the programs and the overall environment at the Encore. Very informative. Organized well. Group size was good. I think there is always some room for improvement, but it was excellent overall. One of the best CASRO conferences ever. Would have been a perfect 10 if the online presence/coverage for the event had been better. Great presentations and papers, very comfortable setting, and good attendance. It was my first time and for the company. It was great to hear about trends and studies in the US. Especially on routing as it is not as big in Japan. Well done overall. Good content, good food, on schedule. I am an unengaged respondent when it comes to verbatims. Lots of good presentations on some new topics. Really enjoyed the panel of panelists. Nice people, good conversations, good networking opportunity, interesting presentations. The content of the conference and its relevance for our sector. Overall Program Rating: Good The program itself was great. Great speakers and topics. There was not enough food though throughout the conference. The lunch on the last day was excellent. More than one cocktail hour and more time in the exhibition hall would’ve been good, too. What specifically about the conference caused you to rate it [INSERT]?
Total Experience Verbatims Overall Program Rating: Good (Continued) I thought the conference material was well above average and the speakers were too. The hotel was very nice. I didn't feel there was enough time or opportunity to network with the other attendees and that is why I did not give the conference a 10. Overall thorough presentations. Content was relevant to me. I always learn very valuable information about the direction of the industry and what is going on in the "panel" world. Also, the quality of speakers and attendees is top notch. It would have been nice to have some smaller, breakout sessions going into more details on certain topics. And perhaps a cocktail reception at the end of Thursday to encourage more folks to linger in the exhibit hall. It was good, but not off the charts. A couple of the presentations seemed to be old and a couple of the presenters did not seem to be able to answer questions regarding their own presentation. I really enjoyed the subject matter of the conference. It would have been nice to have the schedule of the demos, but I don't think running them in the same room during breaks was good - you could not hear them. There were too many discussions about the same type of topic. The program held some very valuable insight around routers, validation, and sourcing that our company can immediately utilize to improve our partnerships. Really great content and education experience, but tempered by the lack of business opportunity, and the couple of thinner presentations. Great papers by fine researchers - Terhanian and Smith in particular -- they are big draws. Good level of attendees and key decision makers. Strong program content and the panelist session with respondents was a refreshing change to the norm. Some good presentations and some so-so presentations. Some of the presenters were great while others were so-so. I enjoyed the discussions about innovations in the sampling process such as the use of routers, measuring respondent quality and ensuring consistency in tracking studies. I found all of the topics to be fairly interesting, even those that didn't necessarily apply to the company I work for. I thought the content was really good and, as a sponsor, we were able to meet the right level of executive. One thing that I didn't think worked so well, and this again is from the perspective of a sponsor, was the layout of the main networking room. I thought it was quite good overall. I think the main room with exhibitors could have been smaller in order to create a sense of a "community". What specifically about the conference caused you to rate it [INSERT]?
Total Experience Verbatims Overall Program Rating: Good (Continued) A bit too detailed around the mechanics of research vs. what the future is. Discussions were relevant but way too much on routers. Lots of great topics and great speakers. The papers covered topics that have been discussed for several years and findings were - more research is needed. I find this to be uninteresting. There was a great deal of valuable information shared, but the presentations were a bit dry ... it would be great to bring in a keynote or something to enhance the experience overall. Too much time spent on routers. The venue, most of the presentations. Beneficial, however not client facing for us. Great group, good topics, terrible layout in terms of vendors / service providers. I had hoped for more in the way of actionable information, but on the whole still enjoyed the conference. I have always enjoyed this conference and this one was strong as well. That said the papers presented weren't as actionable for my specific role this year (sample procurement). Well, I think it was informative. I learned what I "needed" to, but it did focus very prominently on the advanced analytics side of things - validation of methodologies, etc...routing, etc. I thought it was well-planned, with interesting papers. Some of it was over my head. It was a good conference with good content. It fails in comparison though to something like TDMR which was among the best online/technology related conferences I've experienced in a long time. Overall Program Rating: Satisfactory It was good, but was too much "talk" and not enough action in terms of CASRO taking a leadership role in defining solutions to some of the challenges we face. Most of the sessions weren't of interest to us being a programming and online data collection company. As an exhibitor attending by myself I only attended two sessions. It seemed a small sample to provide anything other than a neutral score. Some ultra boring presenters. What specifically about the conference caused you to rate it [INSERT]?
Additional Insights Verbatims Overall Program Rating: Excellent Would have loved the sessions to be on Wed/Thursday rather than Thursday/Friday as it seemed a lot of people left by Friday afternoon. A couple of the slides were not clear. This happens almost every year so some additional work should be done in that area to determine how the presentations will reflect on the bigger screens. Overall, an excellent event. I really liked the panel of panelists - so helpful to hear! And the featured speaker (on big data) was good. Great presenters and loved the panel of panelists. Well-paced and informative. Did miss the cocktail hour on night one for networking. Very insightful and professional. Great effort to stay on schedule. Even the food provided at breaks was great. It would be good if there was a guideline on the font size of slides. It was not easy to read many of the slides. The conference is getting to the stage where it could probably do with adding a few hours and having 1 more 2-3 paper session. Solid program. Great Cirque du Soleil show. Great venue, food and city. Loved the panel of panelists. The consumer panel of panelists was very troublesome and I hope the industry responds accordingly and fast. We should do this every year. CASRO conferences are always a good opportunity for networking. But this one in particular is more relevant for us, given that we work with US panel companies a lot. Overall Program Rating: Good I didn’t get to see some people that I would've liked to see and meet because there were too many back to back sessions, not enough time in the exhibition hall, and no cocktail hours other than the LOVE event, which I did not attend because tickets were sold out. Meals were the only time to really meet other attendees and that was very limited to who was at your table. I was disappointed with the networking opportunities. It was refreshing to hear first-hand, true accounts of a panelist's life cycle and how they view our industry. The content is of a high caliber and highly useful. I learned a lot of new information about research and potential paths that companies are exploring. I would love to see a perspective related to the full-service firm and thoughts/opinion on panel practices. Please provide us with any additional insights regarding your experience at the conference.
Additional Insights Verbatims Overall Program Rating: Good (Continued) I would have had the exhibitor presentations be in a breakout room. I always like this conference because of the papers presented. I miss the Q&A with the presentations. It seemed less engaging this year. Networking opportunities would have been improved a lot by having more organized social time, such as drinks together or so. There was plenty of opportunity to meet with the vendors, which was nice. I would have liked to hear about different areas of the industry. It would be beneficial to build in more time/opportunities for networking in/outside of the conference framework. The Beatles event for example is fun, but not necessarily ideal for getting to talk to a number of the folks you'd like to catch up with. Networking was low simply because it's the same people every year. I like that this is where we share best practices, though, with real transparency, so I didn't give that too much weight. I brought a couple people from my team, and we wound up having kind of a team building experience -- a bonus. Good for networking but a disappointing exhibitor area and format. Facility and location were excellent. I loved the panel of panelists, but I thought that they could have been more representative of panelists in general. They only took from those that are a member of a ton of panels. Well over half the panelists on many panels do not join multiple panels. Much of the content is widely known but not talked about regularly so it was nice to have that forum. Networking opportunities are always great in settings like these. It was great to meet with some of the vendors we work with on a regular basis and to learn about new opportunities from some of the attendees. Generally, it was a very good show and we're looking forward to sponsoring the CASRO technology event in May. Overall good, but it would be great if there was a way to make it more interactive. The Q&A sessions were great. Maybe there is a better way to manage that. Having Q&A after a few speakers seemed to dilute the questions. The papers were excellent. Like to have more of them. The conference should set up some type of speed networking program for the next event. Wonderful venue. I'm so grateful to have had the opportunity to experience that hotel. It was stunning. The conference was very short. I had to leave the sessions to network unfortunately. The panel of panelists was awesome. It was good to reconnect with people. Lacked opportunities that encouraged, or better yet, forced interactions between everyone at the conference. Please provide us with any additional insights regarding your experience at the conference.
Additional Insights Verbatims Overall Program Rating: Good (Continued) The networking was good, the amount of things that could be acted upon could have been higher. I missed breakfast/lunch/dinner being provided on all three days. Since we were in Vegas, it was fun to go out to eat, but in the past I appreciated meals being provided. The organization of the sign up - times of the APP, etc. were a little off - breakfast on the first day was late...seemed like many exhibitors were not even set up by 10:00. Couldn't hear people at certain points. It was my first time attending the CASRO conference, and I learned a great deal. Much of the presented material was not relevant to my own field, but it was well-researched and well-delivered. Vegas is so hard to get home from. I'm almost always forced to stay another day or take a red eye. Neither of which are very appealing. Overall Program Rating: Satisfactory More time for 1-on-1 meetings would be nice, felt that I had to make a tradeoff between having important meetings, and attending the sessions. As an exhibitor there weren't a lot of opportunities for networking and we didn't see a lot of traffic at the booth. I deeply dislike hard sell and didn't receive it from organizers or other exhibitors. The environment seemed more cooperative than competitive which provided a refreshing change. Some of the information was stale - relating to topics that have been discussed over and over and over again, but in the same context. Please provide us with any additional insights regarding your experience at the conference.
Other Comments & Suggestions Positive Comments* Good Overall (4 mentions) Good Panel of Panelists (2 mentions) Negative Comments* Need Better Exhibition Area (6 mentions) Need Different Topics (3 mentions) Need More/Better Networking (2 mentions) Need More Sessions (2 mentions) Poor Breakfast Sessions (2 mentions) *Multiple mentions shown.
Comments & Suggestions Verbatims • Overall Program Rating: Excellent • Great event, great people. Always a huge amount to learn. • The Love Show event was great! Good job all around! • Thanks to Bob, Olivier, and the CASRO team for a very successful event. Well planned and well executed! • Questions should be done after each presentation. • Perhaps less Routing next time. • Video/audio recordings would be great. • The conference program was very panel-centric. I would have liked to see a little more diversity into other aspects of online technologies. • Move it around more, two years per venue is too much. • On the event page, it would be good to be able to download the program in a calendar format. Not just the start and end date like currently offered, but individual times for each presentation. • First, it would be great if each session of the conference was available as a separate Outlook event. The current setup simply blocks off your calendar for the two days of the conference. • In future, I look forward to more attendees so that the sponsor exhibit area won't be empty while sessions are on. • Each year I think it can't be topped, but it is. • The exhibition was not very well positioned. The stands were kind of lost in such a big room. Maybe putting all of them together, as in the Annual Conference, would work better. • Overall Program Rating: Good • Again, the panel of panelists was great. I would love to hear the panels' take on the input • As previously mentioned, I would like to see a full-service firm's perspective on panel company behavior and research. Also, I know why the second day starts so early, so that attendees can fly out, but I inevitably have work that needs to be done and it • It was a terrific hotel. • Vegas was very expensive! • See my first comment: more organized social events. • It was a great conference and appreciate having all the papers available to us. • Either spread out discussions about similar topics or provide a variety of topics. • Promote the use of your Facebook event pages to encourage pre/post conference dialogue and more readily allow attendees to coordinate schedules, share feedback, and increase awareness of the event.
Comments & Suggestions Verbatims • Overall Program Rating: Good (Continued) • The program on the first day was pretty weak. • Better exhibition area or none at all. More inclusive networking opportunities in the evenings. More interactive panel sessions. More sessions with respondents! • I think the exhibitors' presentation setup was truly horrible. Having folks talk during these presentations was just inconsiderate to those presenting but the setup of breakfast invited that. That session was not thought out at all. • I hope the conference mobile app will be made available for Windows Phone 7! • Perhaps have different vendors exhibit on the second day. After the first day I had met with everyone that I wanted to speak with, so I felt that this time was wasted on the second day. • The only thing I would change is the room set-up for the sponsors. As I previously mentioned, it would have been good to drive a bit more traffic to the tables. • The previous question was poorly constructed. I downloaded the app but did not find it useful and did not use it. Most likely because Blackberry apps are rarely easy to use. • Should be a longer event, should have more presentations or seminars, and there should have been a sponsored dinner one of the nights. • The morning presentations over breakfast were terrible. The venue and noise made it so you could not hear a thing. It was not good. • It was my first CASRO event and I really learned a lot. I was a bit surprised that so much focus was placed on panel quality as opposed to online techniques, though. I'd like to have seen more about emerging methodologies in practice. • Great! • Vendor Demo was a good / great idea, but very poorly executed. Casual nature combined with size of the hall didn't work. • I enjoyed the panel on panels. • I would combine this with the global conference in future years so that we could have one conference on sample, suppliers, online, and global. They are very intertwined. • Overall Program Rating: Satisfactory • I personally would like to see CASRO take a firm stance on defining some industry standards. • The idea of the exhibitor showcase was really great, but it didn't seem like a lot of people paid attention. It would be great if the showcase could be in a separate room and the people that were interested could go and listen to the exhibitors. • As an exhibitor, additional draws for attendees, such as an open bar, can be helpful.
Other Topics (Multiple Mention Topics Shown) More From Respondents (6 mentions) Mobile Surveying (6 mentions) Routing (4 mentions) Social Media (4 mentions) Survey Design (4 mentions) Multi-Mode Research (3 mentions) Emerging Markets/International Markets (3 mentions) Sample Blending/Partnering (3 mentions) Emerging Technologies (2 mentions) “Gamification” (2 mentions) Panel Creation (2 mentions) Panel of Panelists (2 mentions) Panelist Incentives (2 mentions)
Other Topics Verbatims Bias due to sampling process. Blending and integrating disparate data and methodologies to create a holistic report. Comparing scales/results across countries on a global study. Emerging methodologies, appropriate applications, and seeing actual results. Frede, Smith, Terhanian. Further research on breaking up trackers. Gamification, multi-mode research (web, phone, etc.) I like crystal ball views - where is the industry going for example. I think for the last few years we have heard a lot from the panel companies and this year from respondents. As a buyer/client, I would like to see a session where we can explore more buyer/client needs and discuss the impacts of technology and developing I think we need to look more closely at the impact of "professional respondents" on data representativity. I would like to hear a full-service firm's perspective on panel practices. I would like to hear from more actual respondents as well. I would like to hear more on routing design and process, social media in research and in routing, and continued mobile t I'd love even more about mobile. And focused sessions on interactivity within a survey and if that affects respondent outcome. International projects and emerging markets (especially Asia). It would have been interesting to hear more about panelist incentive programs. Love the online quality debate - think there's more work to be done here still. Most studies were in initial phases. More advanced studies needed. Loved the format: pragmatic v. experimental. More focus on mobile and sample blending issues. More information on mobile. More on mobile. More on routers. More on routing and river sampling. More on data quality. More on panelists' experiences. More on sampling and weighting of online samples/panels. More on survey engagement/gamification techniques. More respondent focus - possibly in the medical arena as that is where our company focuses. More respondent insights
Other Topics Verbatims (Continued) More social media. More topics on who panelists are and their behavior. Emerging technologies. Multi-mode methodologies. Multi-mode survey design. New technologies; validation; social media as sample source. Online panel creation/Custom Panel Creation and Management Best Practices. Overview of the technology stacks people are using from each company - as case studies. For instance, is it Confirmit for data collection, in-house panel management, off-the-shelf reporting, etc.? Are people more LAMP based, .NET, or Java centered? what Panel on panelists. Panel overlap, how widespread is it, what can be done to clean it up, and when? Survey design, a sharing of best practices to improve overall respondent experience. Panel sourcing and how panels are built, along with additional focus on router discussions. Panel vs. River - Pros & Cons and how can they play nice together. Data Quality - Let's stop talking about new things we could be doing, and focus on agreeing to an industry standard based on things available today. Predictive analytics. Qualitative research online!!!!! A whole world of opportunities. Quality of surveys when LOI is longer than 20 minutes. Respondent quality, sample blending, moving from offline to online in developing regions. Router Sample and the impact on study results? River sample and the impact on study results. Survey design according to different markets. Survey design: the impact of it on respondents. Survey design, best practices for survey moderators/builders. How to best communicate the value of online research to clients not used to it. Techniques for evaluating panel quality in emerging markets (India, China, Brazil, Mexico, etc.). That's something I need to think about more. The panel of panelists was outstanding. I think some more topics as they related to what the panelists identified as what is important to them would be great. 1. Incentives - new opportunities to evolve compensation for the online channel. 2. Live Pan The quality downfall of "partnering panels".
Other Topics Verbatims (Continued) There were some conflicting presentations on the topic of mobile surveys with regards to respondent engagement and participation. It would be great to see some more research done on that topic. Think we need to nail down what we think about social media, more on respondent experience We should be facing what was indicated by the panel of panelists on Thursday afternoon. The general conception was that these panelists -- even though they were carefully selected -- represented a not so pleasant respondent crowd: They were all what we co Whatever topics impact the online research industry. Would enjoy hearing more again about outlier detection. Would also like to see more work about increasing the representativeness of the data we use.
Conference App Verbatims • Loved it, though couldn't get 'my schedule' to work. • Not enough persons in my contacts used it. A couple of the foreign participants said it did not work when they arrived in the US. • The schedule was helpful. • It was ok, but it didn’t have any additional features. It was kind of plain. • Surprised by the lack of Twitter activity. Would be great to have a CASRO moderator feeding the social networks. • It was large, it crashed (a lot) and linked its programmed time to your phone time - great when in Vegas but tough to use in advance. :-) • Would be more useful if I could be notified when sessions I'm interested in are going to start soon. Also, to notify me when someone I want to meet is nearby. • Not many people used that member part? It would be great if attendees shared more info on the app or Twitter. • Love the app. • I couldn't use it during the conference because there was no Wi-Fi signal in the conference room. I need Wi-Fi for my iPad. • Didn't seem like a lot of contacts signed up - I'd like to see more next year. • It had no content at all. Another colleague, also coming from abroad, had the exact same problem. I was using an iPhone and he was using an Android and the same thing happened. • App was great for a list of attendees but I could not seem to utilize the 'my contact' area. I would click share contact info and nothing seemed to happen. • It was useful to have and be able to keep track of the presentations and schedule on my phone. • It was very handy and well done. • Loved the app, helped with scheduling. • We expected to see all attendees listed in the App, so we knew who was there, but that didn't happen. • Very handy to have the schedule on my phone. I liked it a lot - but I understand those things are expensive. • I liked that I didn't have to carry around a copy of the schedule. • I only used it for the schedule. • Needed more information in the app. • It was very well done. I wish more had taken advantage of the contact list and shared information for networking purposes. Please provide us with any additional insights regarding your use of the mobile apps.
Conference App Verbatims (Continued) • It didn't load well most of the time. • I found it kind of limited. The map was nice, but it seemed to load slowly as it downloaded updated tweets. • Your times were off - that led to some confusion about when to show up for certain things like registration, etc. Would have been nice to be able to message someone else at the conference via the app. • The app worked nicely. • It was great! Made the whole process much simpler. • Not enough people put their contact info in and shared it with people; probably didn't know they had to go in and do it. • Navigation was somewhat confusing. I would have liked to have seen pictures along with the attendee lists... and LinkedIn URLs. Please provide us with any additional insights regarding your use of the mobile apps.