Embedding Training Within Warnings Improves Skills of Identifying Phishing Webpages

Author(s):  
Aiping Xiong ◽  
Robert W. Proctor ◽  
Weining Yang ◽  
Ninghui Li

Objective: Evaluate the effectiveness of training embedded within security warnings for identifying phishing webpages. Background: More than 20 million malware and phishing warnings are shown to users of Google Safe Browsing every week. A substantial click-through rate is still evident, and a commonly reported issue is that users lack understanding of the warnings. Nevertheless, each warning provides an opportunity to train users about phishing and how to avoid phishing attacks. Method: To test the use of phishing-warning instances as opportunities to train users’ phishing-webpage detection skills, we conducted an online experiment contrasting the effectiveness of the current Chrome phishing warning with two training-embedded warning interfaces. The experiment consisted of three phases. In Phase 1, participants made login decisions on 10 webpages with the aid of a warning. After a distractor task, participants made legitimacy judgments for 10 different login webpages without warnings in Phase 2. To test the long-term effect of the training, participants were invited back a week later to participate in Phase 3, which was conducted in the same manner as Phase 2. Results: Participants differentiated legitimate and fraudulent webpages better than chance. Performance was similar for all interfaces in Phase 1, for which the warning aid was present. However, the training-embedded interfaces provided better protection than the Chrome phishing warning in both subsequent phases. Conclusion: Embedded training is a complementary strategy to compensate for lack of phishing-webpage detection skill when a phishing warning is absent. Application: Potential applications include the development of training-embedded warnings to enable security training at scale.

Circulation ◽  
2016 ◽  
Vol 133 (suppl_1) ◽  
Author(s):  
Nancy R Cook ◽  
Lawrence J Appel ◽  
Paul K Whelton

Introduction: Although weight loss has favorable effects on intermediate outcomes, such as blood pressure and insulin resistance, few studies have examined its effects on long-term outcomes, including total mortality. Methods: In the Trials of Hypertension Prevention (TOHP), individuals aged 30-54 years with high-normal BP were randomized to a weight loss intervention, to one of several other lifestyle or dietary supplement interventions, or to usual care. All participants from Phase 1 (1987-90) and Phase 2 (1990-95) were followed for mortality through 2013. The association of weight change during any of the interventions with long-term mortality up to 18-24 years after the trial periods was examined among 3828 participants who fell into a high baseline weight stratum, defined as a body mass index of at least 26 kg/m² in men and 24 kg/m² in women. Results and Conclusions: There were 1477 high-weight participants in Phase 1 and 2351 in Phase 2, of whom 21% and 50%, respectively, were assigned to a weight loss intervention. Overall, mean weight change during the trial period was -1.8 lbs (-0.8% of baseline body weight) over 1.5 years in Phase 1 and 1.6 lbs (0.8%) over 3-4 years in Phase 2. A total of 556 (15%) lost >5%, 1101 (29%) lost ≤5%, 1567 (41%) gained <5%, and 604 (16%) gained >5% of body weight. Corresponding hazard ratios (HRs) for total mortality were 0.82 (95% confidence interval (CI)=0.57-1.18), 0.94 (95% CI=0.72-1.23), 1.00 (reference), and 1.29 (95% CI=0.92-1.80) (p-trend = 0.046). There was a direct linear relationship between percent change in weight during the trial period and later mortality (HR=1.14 per 5% change, 95% CI=1.02-1.28, p=0.019). This association persisted throughout the course of mortality follow-up (Figure). In these healthy individuals taking part in lifestyle and nutrition supplement trials, short-term weight change was directly associated with mortality about two decades later.
These results are consistent with a long-term beneficial effect of presumed intentional weight loss on total mortality.


2021 ◽  
Author(s):  
Kohei Hotta ◽  
Masato Iguchi

Abstract We herein propose an alternative model for deformation caused by an eruption at Sakurajima, which has previously been interpreted as being due to a Mogi-type spherical point source beneath Minami-dake. On November 13, 2017, a large explosion with a plume height of 4,200 m occurred at Minami-dake. During the three minutes following the onset of the explosion (22:07–22:10 Japan Standard Time (UTC+9), November 13, 2017; the same hereinafter), phase 1, a large strain change was detected at the Arimura observation tunnel (AVOT), located approximately 2.1 km southeast of the Minami-dake crater. After the peak of the explosion (22:10–24:00, November 13, 2017), phase 2, a large deflation was detected at every monitoring station due to the continuous Strombolian eruption. Subsidence toward Minami-dake was detected at five out of six stations, whereas subsidence toward the north of Sakurajima was detected at the newly installed Komen observation tunnel (KMT), located approximately 4.0 km northeast of the Minami-dake crater. The large strain change at AVOT, as well as the small tilt changes at all stations and the small strain changes at HVOT and KMT during phase 1, can be explained by a very shallow deflation source beneath Minami-dake at 0.1 km below sea level (bsl). For phase 2, a deeper deflation source beneath Minami-dake at a depth of 3.3 km bsl was found in addition to the shallow source beneath Minami-dake, which turned to inflation after the deflation obtained during phase 1. However, this model cannot explain the tilt change at KMT. Adding a spherical deflation source beneath Kita-dake at a depth of 3.2 km bsl can explain the tilt and strain changes at KMT and the other stations. The Kita-dake source was also found in a previous study of long-term ground deformation. Not only the deeper Minami-dake source MD but also the Kita-dake source deflated due to the Minami-dake explosion.
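The Mogi-type point source used as the baseline interpretation above has a simple closed form: for a small spherical pressure source with volume change ΔV at depth d in an elastic half-space, surface displacements decay with the cube of the slant distance to the source. A minimal Python sketch of these standard formulas (illustrative only, not the paper's inversion; the Poisson's ratio ν = 0.25 and the numeric values in the usage note are assumptions):

```python
import math

def mogi_displacement(r, depth, dV, nu=0.25):
    """Surface displacement from a Mogi point source in an elastic half-space.

    r     : horizontal distance from the source axis (m)
    depth : source depth (m)
    dV    : source volume change (m^3); negative = deflation
    nu    : Poisson's ratio of the half-space (assumed 0.25)

    Returns (ur, uz): radial and vertical surface displacement (m).
    """
    R3 = (r * r + depth * depth) ** 1.5      # slant distance cubed
    c = (1.0 - nu) * dV / math.pi
    ur = c * r / R3      # horizontal (radial) displacement
    uz = c * depth / R3  # vertical displacement (uplift if dV > 0)
    return ur, uz
```

For example, a deflation source (negative ΔV) at 3.3 km depth predicts subsidence directly above the source and inward radial motion at nearby stations, consistent with the subsidence pattern the abstract describes; the actual source volumes are not given here.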


Nutrients ◽  
2020 ◽  
Vol 12 (10) ◽  
pp. 2911
Author(s):  
Angelo Tremblay ◽  
Maëlys Clinchamps ◽  
Bruno Pereira ◽  
Daniel Courteix ◽  
Bruno Lesourd ◽  
...  

Objectives: This study was performed to evaluate the long-term maintenance of nutritional changes promoted during an intensive initial intervention to induce body weight loss. The ability of these changes to predict long-term health outcomes was also examined. Methods: Nutritional variables, body composition, and metabolic markers collected in the RESOLVE project were analyzed before and after a 3-week intensive diet–exercise intervention (Phase 1) and during a subsequent 12-month period of supervision under free-living conditions (Phase 2). Results: As expected, the macronutrient composition of the diet was modified to promote a negative energy balance during Phase 1. The decrease in carbohydrates imposed during this phase was maintained during Phase 2, whereas the increase in protein intake returned to baseline values by the end of the program. Dietary fiber intake almost doubled during Phase 1 and remained significantly greater than baseline values throughout Phase 2. Moreover, fiber intake was the only nutritional variable that systematically and significantly predicted variations in health outcomes in the study. Conclusion: The adequacy of dietary fiber intake should be a matter of primary consideration in diet-based weight reduction programs.


CJEM ◽  
2018 ◽  
Vol 20 (S1) ◽  
pp. S27-S27
Author(s):  
C. Leafloor ◽  
P. Jiho Hong ◽  
L. Sikora ◽  
J. Elliot ◽  
M. Mukarram ◽  
...  

Introduction: Approximately 50% of patients discharged from the Emergency Department (ED) after syncope have no cause found. Long-term outcomes among syncope patients are not well studied, which limits guidance for physicians regarding outpatient testing and follow-up. The objective of this study was to conduct a systematic review of long-term (one-year) outcomes among ED patients with syncope. We aim to use the results of this review to guide a prospective analysis of one-year outcomes in our large database of syncope patients. Methods: We searched the Cochrane Central Register of Controlled Trials, Medline and Medline In-Process, PubMed, Embase, and the Cumulative Index to Nursing and Allied Health Literature (CINAHL) from inception to June 2017. We included studies that reported long-term outcomes among adult ED patients (16 years or older) with syncope. We excluded studies on pediatric patients and studies that included syncope mimickers: pre-syncope, seizure, intoxication, and loss of consciousness after head trauma. We also excluded case reports, letters to the editor, and review articles. Outcomes included death, syncope recurrence requiring hospitalization, arrhythmias, and procedural interventions for arrhythmias. We selected articles based on title and abstract review during phase 1 and conducted full-article review during phase 2. Meta-analysis was performed by pooling the outcomes using a random-effects model (RevMan v.5.3; Cochrane Collaboration). Results: The initial literature search generated 2094 articles after duplicate removal. 50 articles remained after phase 1 (κ = 0.85) and 16 articles were included in the systematic review after phase 2 (κ = 0.86). The 16 included studies enrolled a total of 44,755 patients.
Pooled analysis at 1-year follow-up showed the following outcomes: 7% mortality; 14% recurrence of syncope requiring hospitalization; one study reported that 0.6% of patients had a pacemaker inserted; and two studies reported that 0.8–11.5% of patients suffered new arrhythmias. Conclusion: An important proportion of ED patients with syncope suffer adverse outcomes at 1 year. Appropriate follow-up is needed to prevent long-term adverse outcomes. Further prospective research to identify patients at risk for important long-term cardiac outcomes and death is needed.
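The random-effects pooling step described in the Methods is commonly implemented with the DerSimonian-Laird estimator (the approach RevMan uses for inverse-variance random-effects meta-analysis). A minimal sketch, assuming generic per-study effect estimates and within-study variances; the example numbers are hypothetical, not the review's data:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effect estimates with a DerSimonian-Laird random-effects model.

    effects   : per-study effect estimates (e.g., proportions or log-risks)
    variances : corresponding within-study variances

    Returns (pooled_effect, standard_error, tau2), where tau2 is the
    estimated between-study variance.
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                 # fixed-effect (inverse-variance) weights
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)               # DL between-study variance (floored at 0)
    w_star = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2
```

With hypothetical 1-year mortality proportions of 0.05, 0.10, and 0.07 and variances 0.001, 0.002, and 0.0015, the pooled estimate falls between the study extremes, and tau² is zero when the studies are homogeneous.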


Blood ◽  
2014 ◽  
Vol 124 (21) ◽  
pp. 41-41
Author(s):  
John Koreth ◽  
Haesook T. Kim ◽  
Kyle T. Jones ◽  
Carol G Reynolds ◽  
Marie J Chammas ◽  
...  

Abstract Chronic graft-versus-host disease (cGVHD) after allogeneic hematopoietic cell transplantation (HCT) results from incomplete reconstitution of immune tolerance. CD4+CD25+FOXP3+ regulatory T cells (Treg) are required for tolerance and function as dominant suppressors of innate and adaptive immune effector cells. In our prior phase 1 cGVHD study, daily subcutaneous (SC) low-dose interleukin-2 (IL-2) for 8 weeks induced Treg expansion in vivo and objective clinical responses in 12 of 23 evaluable participants (NEJM 2011). We now report on a phase 2 trial of daily low-dose SC IL-2 at 1×10⁶ IU/m²/d for 12 weeks in steroid-refractory cGVHD. The study comprised 35 HCT recipients (51% male, 91% HLA-matched PBSC grafts). Median participant age was 51 years (range, 22-72). Median time from HCT and from cGVHD onset to start of IL-2 treatment was 616 days (range, 270-2145) and 252 days (range, 28-1880), respectively. Participants had a median of 4 cGVHD organ sites (range, 1-7) and 2 concurrent cGVHD therapies (range, 1-3) at enrollment. The median baseline prednisone dose was 20 mg (range, 2.5-50). The median follow-up in survivors was 21 months (range, 4-35). Twelve weeks of low-dose IL-2 was well tolerated: 2 participants withdrew and 5 required IL-2 dose reduction, for constitutional AEs (n=6) and thrombocytopenia (n=1); 1 had a Gr 3 infection (bacteremia); and none experienced relapse. At week 12, objective cGVHD responses (PR) were documented in 21 of 33 evaluable participants (64%). Two (6%) had cGVHD progression. cGVHD response sites included skin (n=9), joint/fascia/muscle (n=4), liver (n=7), lung (n=3), and GI tract (n=4). Overall 2-year OS/PFS was 91% (responders 94%; non-responders 83%). 23 participants with clinical benefit (PR or SD with minor response) proceeded to extended IL-2 therapy.
Immunologically, low dose IL-2 induced a >4-fold increase in median Treg count/µL (p<0.001): a rapid rise from a baseline of 17.1 (Q1-Q3, 8.6-40.6) to a week 4 peak of 137.9 (Q1-Q3, 51.8-188) and subsequent stabilization with a week 12 count of 104.1 (Q1-Q3, 53.9-167.1). No significant change in CD4 conventional T cell (Tcon), CD8 T cell, or CD20 B cell count was noted. NK cell count increased >3-fold (p<0.001). The median Treg:Tcon ratio increased >4-fold (p<0.001): a rapid rise from baseline of 0.06 (Q1-Q3, 0.05-0.13) to a week 2 peak of 0.35 (Q1-Q3, 0.26-0.48) that remained elevated through a week 12 ratio of 0.31 (Q1-Q3, 0.27-0.39) (Figure). Treg count and Treg:Tcon ratio declined during 4 weeks off IL-2 and rose thereafter on restarting IL-2. Clinical responders were younger (50 vs. 61.5 years, p=0.01) and initiated IL-2 earlier (499 vs. 903 days post HCT, p=0.015). Responders had a higher median Treg:Tcon ratio at study baseline (0.09 vs. 0.06, p=0.052) and at week 1 of IL-2 (0.3 vs. 0.14, p=0.01). Combining phase 1 and 2 data, Treg:Tcon ratios of ≥0.07 at baseline and ≥0.2 at week 1 of IL-2 were highly predictive of clinical response (p=0.007; p=0.0013 respectively). The combined phase 1 and 2 extended IL-2 cohort comprised 35 participants with a median follow up of 16.2 months (range, 4.1-66.8), with 20 and 12 participants receiving over 1 and 2 years of IL-2 respectively. Extended daily low dose IL-2 was well tolerated, and only 4 participants had Gr 3 AEs deemed IL-2 related: lung infection (n=1), arthralgia (n=1), and injection site induration (n=2). 5 participants required IL-2 dose reduction and 1 had hematologic malignancy relapse. Clinical responses were typically sustained during taper of concomitant immunosuppression. Treg augmentation persisted for the duration of IL-2 therapy. Tcon count slowly recovered to normal levels and Treg:Tcon ratio gradually normalized over a 2 year period. There was no change in CD8 count. 
The median steroid taper was 50% (range, −20 to 100). In summary, daily low-dose IL-2 therapy induced profound Treg enhancement and clinical responses in over half of refractory cGVHD patients. Early clinical response predictors suggest IL-2 is more effective earlier in the cGVHD course and when starting numbers of Treg are higher. Sustained clinical and immunologic responses during extended IL-2 were documented. Long-term tolerance induction with daily low-dose IL-2 is a promising and feasible strategy. Optimizing IL-2 clinical response by further augmenting Treg and the Treg:Tcon ratio early in the course of cGVHD is worth exploring. Figure 1. Disclosures Koreth: Prometheus Laboratories Inc: Research Funding; Millennium Pharmaceuticals Inc: Research Funding; Takeda Pharmaceuticals Inc: Membership on an entity's Board of Directors or advisory committees. Off Label Use: Low-dose Interleukin-2 for immune tolerance. Chen: Bayer Pharmaceuticals, Inc.: Other, Research Funding. Avigan: Astex Pharmaceuticals: Research Funding.


2018 ◽  
Vol 43 (1) ◽  
pp. 47-54 ◽  
Author(s):  
Rhys James Williams ◽  
Atsushi Takashima ◽  
Toru Ogata ◽  
Catherine Holloway

Background: Thermal discomfort among lower-limb prosthesis wearers is prevalent, with social and medical consequences. Objectives: This study aimed to verify the feasibility of out-of-laboratory thermal comfort studies. Study design: Repeated-measures pilot study. Methods: Thermistors were placed on participants’ residual limbs during two experimental phases. In phase 1, mean limb temperature was calculated over a controlled 55-min rest-exercise-rest protocol. In phase 2, participants conducted activities of their choosing wherever they wanted away from the lab, while limb temperature data were collected. Descriptive statistics and statistical differences between phases are presented. Results: Five male amputees participated, with a mean age ± standard deviation of 30 ± 9 years. In phase 1, mean limb temperature change ranged between 1.6°C and 3.7°C. In phase 2, mean limb temperature change ranged between 1.8°C and 5.1°C. Limb temperature was significantly higher in out-of-lab studies (+1.9°C, p = 0.043) than in in-lab studies. Conclusion: Independent multiple-hour temperature studies are shown to be feasible. Results also indicate that out-of-lab residual limb temperature can be significantly higher than in-lab temperatures. Clinical relevance: Thermal discomfort and sweating may lead to skin conditions and reduce quality of life among prosthesis wearers. Out-of-lab, long-term temperature studies are needed to comprehensively characterize thermal discomfort and to create preventive solutions.


2021 ◽  
Vol 12 ◽  
Author(s):  
Zhihao Tu ◽  
Helena de Fátima Silva Lopes ◽  
Takashi Narihiro ◽  
Isao Yumoto

Indigo fermentation fluid maintains its indigo-reducing state for more than 6 months under open-air conditions. To elucidate the mechanism underlying the sustainability of this indigo-reducing state, three indigo fermentation batches with different durations of the indigo-reducing state were compared. The three examined batches exhibited different microbiota, and each fermentation consisted of two phases. In the initial phase, oxygen-metabolizing bacteria derived from sukumo established an initial network. With decreasing oxidation-reduction potential (ORP), the initial bacterial community was replaced by obligate anaerobes (mainly Proteinivoraceae; phase 1). Approximately 1 month after the beginning of fermentation, the predominant obligate anaerobes decreased; Amphibacillus and Polygonibacillus, which can decompose macromolecules derived from wheat bran, became predominant; and the transition of the microbiota slowed (phase 2). Considering the substrate-utilization abilities of the dominant bacterial taxa, the transition from phase 1 to phase 2 suggests a shift from a bacterial flora that utilizes substrates derived from sukumo (including intrinsic substrates in sukumo and bacterial cells weakened or killed by early events such as heat and alkaline treatment and the reduction of ORP) to one of wheat-bran utilizers. This succession was directly related to the change in the major substrate sustaining the corresponding community, and the turning point was approximately 1 month after the start of fermentation. Thus, the role of sukumo includes shaping the microbial flora immediately after the start of fermentation, an important function in the start-up phase, whereas the ecosystem comprising the wheat bran-utilizing microbiota underpins the subsequent long-term indigo reduction.


2020 ◽  
Author(s):  
Kohei Hotta ◽  
Masato Iguchi

Abstract We herein propose an alternative model for deformation caused by eruptions at Sakurajima, which has previously been interpreted as being due to a Mogi-type source beneath Minami-dake. On November 13, 2017, a large explosion with a plume height of 4,200 m occurred at Minami-dake. During the three minutes following the onset of the explosion (22:07–22:10 Japan Standard Time (UTC+9), November 13, 2017; the same hereinafter), phase 1, a large strain change was detected at the Arimura observation tunnel (AVOT), located approximately 2.1 km southeast of the Minami-dake crater. After the climax of the explosion (22:10–24:00, November 13, 2017), phase 2, a large deflation was detected at every monitoring station due to the continuous Strombolian eruption. Subsidence toward Minami-dake was detected at five out of six stations, whereas subsidence toward the north of Sakurajima was detected at the newly installed Komen observation tunnel (KMT), located approximately 4.0 km northeast of the Minami-dake crater. The large strain change at AVOT during phase 1 can be explained by a very shallow deflation source beneath Minami-dake at 0.1 km below sea level (bsl). For phase 2, a deeper source beneath Minami-dake at a depth of 3.3 km bsl deflated in addition to the shallow source beneath Minami-dake, which turned inflationary after the deflation obtained during phase 1. However, this model cannot explain the tilt change at KMT. Adding a spherical deflation source beneath Kita-dake at a depth of 3.2 km bsl can explain the tilt and strain changes at KMT and the other stations. The Kita-dake source was also found in a previous study of long-term ground deformation events. Not only the deeper Minami-dake source MD but also the Kita-dake source deflated due to the Minami-dake explosion.


2021 ◽  
Vol 29 ◽  
pp. S253-S254
Author(s):  
D. Hunter ◽  
A. Mobasheri ◽  
S. Mareya ◽  
M. Wang ◽  
H. Choi ◽  
...  
Keyword(s):  
Phase 1 ◽  
Phase 2 ◽  
Phase 3 ◽  

2021 ◽  
Vol 39 (15_suppl) ◽  
pp. 9071-9071
Author(s):  
Scott N. Gettinger ◽  
Rudolf M. Huber ◽  
Dong-Wan Kim ◽  
Lyudmila Bazhenova ◽  
Karin Holmskov Hansen ◽  
...  

9071 Background: BRG is a kinase inhibitor approved for the treatment of patients (pts) with ALK+ metastatic NSCLC; specific details for BRG use vary by indication and country. We report long-term efficacy and safety results of the Phase 1/2 and Phase 2 (ALTA) trials of BRG. Methods: The Phase 1/2 study was a single-arm, open-label trial (NCT01449461) of BRG 30–300 mg/d in pts with advanced malignancies. ALTA (NCT02094573) randomized pts with CRZ-refractory ALK+ NSCLC to receive BRG at 90 mg qd (arm A) or 180 mg qd with a 7-d lead-in at 90 mg (arm B). For the Phase 1/2 study, investigator assessments of confirmed objective response rate (cORR; RECIST v1.1), duration of response (DoR), progression-free survival (PFS), overall survival (OS), and safety in pts with ALK+ NSCLC are reported. The primary endpoint of ALTA was cORR per investigator; secondary endpoints included cORR per independent review committee (IRC), DoR, PFS, and OS. Results: In the Phase 1/2 study, 137 pts received BRG; of these, 79 pts had ALK+ NSCLC (71/79 had prior CRZ; 28/79 received 180 mg qd [7-d lead-in at 90 mg]; 14/79 received 90 mg qd). In ALTA, 222 pts with CRZ-refractory ALK+ NSCLC were randomized (n = 112/110, arm A/B). At the end of the Phase 1/2 study (Feb 18, 2020), with median 27.7 mo follow-up (~67 mo after the last pt enrolled), 4 pts remained on BRG. At the end of ALTA (Feb 27, 2020), with median 19.6/28.3 mo follow-up in arm A/B (~53 mo after the last pt enrolled), 10/17 pts in arm A/B were still on treatment. The Table shows efficacy results from the final analyses with long-term follow-up. In ALTA, the IRC-assessed intracranial cORR in pts with measurable baseline brain metastases was 50% (13/26) in arm A and 67% (12/18) in arm B; Kaplan-Meier (KM) estimated median intracranial DoR was 9.4 mo (95% CI, 3.7, not reached [NR]) in arm A and 16.6 mo (3.7, NR) in arm B. With long-term follow-up, no new safety signals were identified.
Treatment-emergent adverse events led to dose interruption (Phase 1/2: 59%; ALTA arm A/B: 49%/61%), dose reduction (13%; 8%/33%), or discontinuation (10%; 4%/13%). Conclusions: BRG showed sustained long-term activity, PFS, and manageable safety in pts with CRZ-refractory ALK+ NSCLC. The 180 mg/d dose after 7-d lead-in at 90 mg/d led to numerically higher median PFS and OS. Final results are similar to those reported for other approved ALK tyrosine kinase inhibitors in this setting. Clinical trial information: NCT01449461, NCT02094573. [Table: see text]

